Apr 16 13:08:52.754847 ip-10-0-142-166 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:08:52.754861 ip-10-0-142-166 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:08:52.754870 ip-10-0-142-166 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:08:52.755166 ip-10-0-142-166 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:09:02.759872 ip-10-0-142-166 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:09:02.759888 ip-10-0-142-166 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 160a132753d3403eab4f3fc158be2d29 --
Apr 16 13:11:25.286413 ip-10-0-142-166 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:11:25.782517 ip-10-0-142-166 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:11:25.782517 ip-10-0-142-166 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:11:25.782517 ip-10-0-142-166 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:11:25.782517 ip-10-0-142-166 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:11:25.782517 ip-10-0-142-166 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:11:25.784270 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.784183 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:11:25.790274 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790257 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:25.790274 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790273 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:25.790274 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790276 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790280 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790283 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790286 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790289 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790292 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790295 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790298 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790300 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790303 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790305 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790310 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790314 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790317 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790321 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790331 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790335 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790338 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790341 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:25.790369 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790344 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790346 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790349 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790351 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790354 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790356 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790359 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790362 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790364 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790367 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790370 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790372 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790375 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790378 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790381 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790384 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790386 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790389 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790392 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790394 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:25.790827 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790397 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790399 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790402 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790405 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790407 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790415 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790418 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790422 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790425 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790428 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790430 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790433 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790436 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790438 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790443 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790448 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790451 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790455 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790458 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790461 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:25.791454 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790464 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790467 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790469 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790472 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790475 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790477 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790479 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790482 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790485 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790488 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790490 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790493 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790495 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790498 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790501 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790504 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790507 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790509 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790512 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790516 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:25.791948 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790519 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790521 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790524 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790526 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790534 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790934 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790939 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790942 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790945 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790948 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790951 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790953 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790956 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790958 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790961 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790964 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790966 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790969 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790971 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790974 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:25.792444 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790977 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790979 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790982 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790985 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790987 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790990 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790993 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790996 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.790998 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791001 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791004 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791007 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791010 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791012 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791015 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791017 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791020 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791023 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791026 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791028 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:25.792921 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791031 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791034 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791036 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791039 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791044 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791047 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791050 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791054 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791057 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791059 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791062 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791065 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791068 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791071 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791073 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791076 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791079 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791081 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791084 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:25.793450 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791087 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791089 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791092 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791094 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791097 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791100 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791102 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791105 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791109 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791112 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791114 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791129 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791132 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791135 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791137 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791140 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791143 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791145 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791148 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791151 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:25.793916 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791155 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791159 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791161 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791164 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791167 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791169 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791172 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791174 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791177 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791180 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791183 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.791185 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792108 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792132 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792139 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792144 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792148 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792152 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792157 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792162 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792165 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792168 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:11:25.794426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792172 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792175 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792178 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792181 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792183 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792186 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792189 2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792191 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792194 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792199 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792202 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792205 2575 flags.go:64] FLAG: --config-dir=""
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792208 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792211 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792215 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792218 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792221 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792224 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792228 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792231 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792234 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792237 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792240 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792244 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792249 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:11:25.794971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792266 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792270 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792274 2575 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792278 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792282 2575 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792286 2575 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792289 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792292 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792295 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792299 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792302 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792305 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792308 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792311 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792314 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792317 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792320 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792323 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792326 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792329 2575 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792333 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792336 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 13:11:25.795611 ip-10-0-142-166
kubenswrapper[2575]: I0416 13:11:25.792339 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792342 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792345 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:11:25.795611 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792349 2575 flags.go:64] FLAG: --help="false" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792353 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-142-166.ec2.internal" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792356 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792359 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792362 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792366 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792369 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792373 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792376 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792379 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792382 2575 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792385 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792388 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792391 2575 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792393 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792396 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792399 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792402 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792405 2575 flags.go:64] FLAG: --lock-file="" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792407 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792411 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792414 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792419 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:11:25.796263 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792422 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792425 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: 
I0416 13:11:25.792427 2575 flags.go:64] FLAG: --logging-format="text" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792430 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792434 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792437 2575 flags.go:64] FLAG: --manifest-url="" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792439 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792444 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792447 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792453 2575 flags.go:64] FLAG: --max-pods="110" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792456 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792460 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792462 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792466 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792469 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792472 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792475 2575 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792483 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792486 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792489 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792493 2575 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792495 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792501 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792504 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:11:25.796840 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792507 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792510 2575 flags.go:64] FLAG: --port="10250" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792513 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792516 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03cd3de3c45e86080" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792519 2575 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792523 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792526 
2575 flags.go:64] FLAG: --register-node="true" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792528 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792531 2575 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792535 2575 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792538 2575 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792542 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792545 2575 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792548 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792551 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792554 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792557 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792560 2575 flags.go:64] FLAG: --runonce="false" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792563 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792568 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792571 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 
13:11:25.792575 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792577 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792581 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792584 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792587 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:11:25.797477 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792590 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792593 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792596 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792599 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792602 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792606 2575 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792609 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792614 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792617 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792619 2575 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792624 2575 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792626 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792629 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792632 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792635 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792640 2575 flags.go:64] FLAG: --v="2" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792644 2575 flags.go:64] FLAG: --version="false" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792648 2575 flags.go:64] FLAG: --vmodule="" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792653 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.792656 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792754 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792758 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792761 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792764 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 
16 13:11:25.798171 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792766 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792771 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792774 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792777 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792780 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792783 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792786 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792789 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792791 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792794 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792797 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792799 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792802 2575 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstall Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792805 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792807 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792810 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792812 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792815 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792817 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792820 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:11:25.798740 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792823 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792825 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792828 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792832 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792834 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792837 2575 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792839 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792842 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792845 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792847 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792850 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792852 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792855 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792859 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792861 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792864 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792866 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792869 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:11:25.799315 
ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792872 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792875 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:11:25.799315 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792877 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792880 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792883 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792886 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792888 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792891 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792894 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792896 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792899 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792902 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792905 2575 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792907 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792910 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792914 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792917 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792921 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792925 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792928 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792931 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:11:25.799813 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792933 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792937 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792939 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792942 2575 feature_gate.go:328] unrecognized 
feature gate: NewOLM Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792945 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792947 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792951 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792954 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792957 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792959 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792962 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792965 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792967 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792969 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792972 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792975 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792977 
2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792980 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792984 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:25.800302 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792987 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792990 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792993 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.792996 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.793959 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.800417 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.800433 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800481 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800486 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800491 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800495 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800498 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800501 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800504 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800506 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:25.800778 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800509 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800512 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800514 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800517 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800520 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800522 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800525 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800527 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800530 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800533 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800536 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800538 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800541 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800543 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800547 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800550 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800553 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800555 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800558 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800561 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:25.801197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800563 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800566 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800568 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800571 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800574 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800576 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800579 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800582 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800585 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800587 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800590 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800593 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800596 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800598 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800601 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800603 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800606 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800608 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800611 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:25.801725 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800613 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800616 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800618 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800621 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800623 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800626 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800628 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800631 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800635 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800638 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800640 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800643 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800646 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800648 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800651 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800654 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800657 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800659 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800662 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800665 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:25.802210 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800668 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800670 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800674 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800678 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800681 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800684 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800687 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800689 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800692 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800694 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800697 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800700 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800702 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800704 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800707 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800710 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800712 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800714 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:25.802703 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800717 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.800722 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800821 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800826 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800829 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800832 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800834 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800837 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800840 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800843 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800845 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800848 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800851 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800855 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800858 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:25.803166 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800861 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800863 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800866 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800868 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800870 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800873 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800875 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800878 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800882 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800885 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800888 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800891 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800894 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800897 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800899 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800902 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800905 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800908 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800910 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:25.803533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800913 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800916 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800919 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800922 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800924 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800927 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800929 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800932 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800934 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800937 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800939 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800942 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800944 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800946 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800949 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800952 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800954 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800957 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800959 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800962 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:25.803999 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800980 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800983 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800986 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800989 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800992 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800995 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.800997 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801000 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801002 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801005 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801008 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801011 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801013 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801016 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801018 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801023 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801026 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801029 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801031 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801034 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:25.804579 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801036 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801039 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801041 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801044 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801046 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801049 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801052 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801054 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801057 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801059 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801062 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801064 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801067 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:25.801070 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.801074 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:11:25.805085 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.802072 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:11:25.805533 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.804181 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:11:25.805533 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.805231 2575 server.go:1019] "Starting client certificate rotation"
Apr 16 13:11:25.805533 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.805329 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:11:25.805533 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.805368 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:11:25.832725 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.832700 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:11:25.835337 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.835316 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:11:25.853707 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.853677 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:11:25.859389 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.859365 2575 log.go:25] "Validated CRI v1 image API"
Apr 16 13:11:25.860575 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.860558 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:11:25.861315 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.861296 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:11:25.863183 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.863156 2575 fs.go:135] Filesystem UUIDs: map[4de33e25-aff8-4f89-915b-d3182a438c70:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e310d190-85b1-42f9-b3ff-7c1a8236859e:/dev/nvme0n1p3]
Apr 16 13:11:25.863250 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.863183 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:11:25.869362 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.869240 2575 manager.go:217] Machine: {Timestamp:2026-04-16 13:11:25.867205684 +0000 UTC m=+0.456582763 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099319 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec246cf912fa873ec9af7ca258c6653a SystemUUID:ec246cf9-12fa-873e-c9af-7ca258c6653a BootID:160a1327-53d3-403e-ab4f-3fc158be2d29 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9c:c7:68:40:c7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9c:c7:68:40:c7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:20:85:f7:68:4d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:11:25.869461 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.869377 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:11:25.869498 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.869463 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:11:25.871265 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.871241 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:11:25.871416 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.871267 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-166.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 13:11:25.871466 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.871425 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 13:11:25.871466 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.871435 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 13:11:25.871466 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.871448 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:11:25.871466 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.871462 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:11:25.873349 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.873337 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:11:25.873453 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.873443 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:11:25.876065 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.876055 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:11:25.876102 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.876070 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 13:11:25.876102 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.876083 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:11:25.876102 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.876092 2575 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:11:25.876235 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.876103 2575 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 13:11:25.877255 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.877242 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:11:25.877330 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.877262 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:11:25.881522 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.881504 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:11:25.883463 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.883447 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:11:25.885044 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885032 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:11:25.885095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885049 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:11:25.885095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885055 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:11:25.885095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885061 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:11:25.885095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885067 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:11:25.885095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885073 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:11:25.885095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885078 2575 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 13:11:25.885095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885084 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:11:25.885095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885090 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:11:25.885095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885096 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:11:25.885357 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885109 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:11:25.885357 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.885133 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:11:25.886103 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.886092 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:11:25.886103 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.886102 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:11:25.889986 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.889967 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:11:25.890068 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.890024 2575 server.go:1295] "Started kubelet" Apr 16 13:11:25.890741 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.890672 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:11:25.890848 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.890761 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:11:25.890983 ip-10-0-142-166 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 13:11:25.891179 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.891141 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:11:25.891251 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:25.891193 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:11:25.891302 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:25.891270 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:11:25.891997 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.891979 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:11:25.895147 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.894943 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:11:25.900461 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.900444 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 13:11:25.901019 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.900994 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:11:25.901648 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:25.900516 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-166.ec2.internal.18a6d871b2fbb879 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-166.ec2.internal,UID:ip-10-0-142-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-166.ec2.internal,},FirstTimestamp:2026-04-16 13:11:25.889984633 +0000 UTC m=+0.479361713,LastTimestamp:2026-04-16 13:11:25.889984633 +0000 UTC m=+0.479361713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-166.ec2.internal,}" Apr 16 13:11:25.901756 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.901686 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:11:25.901945 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.901920 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-75m4p" Apr 16 13:11:25.902032 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:25.901953 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 13:11:25.902353 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902330 2575 factory.go:55] Registering systemd factory Apr 16 13:11:25.902434 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902376 2575 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:11:25.902500 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902485 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:11:25.902566 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902504 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:11:25.902622 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902586 2575 factory.go:153] Registering CRI-O factory Apr 16 13:11:25.902622 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902601 2575 factory.go:223] Registration of the crio container factory successfully Apr 16 13:11:25.902710 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902647 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:11:25.902710 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902655 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:11:25.902710 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902661 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:11:25.902710 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902485 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:11:25.902710 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902684 2575 factory.go:103] Registering Raw factory Apr 16 13:11:25.902710 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.902700 2575 
manager.go:1196] Started watching for new ooms in manager Apr 16 13:11:25.902953 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:25.902711 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 16 13:11:25.903165 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.903151 2575 manager.go:319] Starting recovery of all containers Apr 16 13:11:25.906944 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.906804 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-75m4p" Apr 16 13:11:25.907785 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:25.907761 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 13:11:25.907879 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:25.907854 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 13:11:25.910094 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.910065 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 13:11:25.912974 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.912957 2575 manager.go:324] Recovery completed Apr 16 13:11:25.917476 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.917461 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:25.922018 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.922004 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:25.922089 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.922029 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:25.922089 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.922039 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:25.923175 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.923158 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:11:25.923175 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.923172 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:11:25.923319 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.923188 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:11:25.926776 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.926761 2575 policy_none.go:49] "None policy: Start" Apr 16 13:11:25.926845 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.926779 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:11:25.926845 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.926789 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:11:25.987761 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.973735 2575 manager.go:341] "Starting Device Plugin manager" Apr 16 13:11:25.987761 ip-10-0-142-166 
kubenswrapper[2575]: E0416 13:11:25.973767 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:11:25.987761 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.973780 2575 server.go:85] "Starting device plugin registration server" Apr 16 13:11:25.987761 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.974048 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 13:11:25.987761 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.974058 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:11:25.987761 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.974147 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 13:11:25.987761 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.974219 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 13:11:25.987761 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:25.974225 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 13:11:25.987761 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:25.974758 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 13:11:25.987761 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:25.974799 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-166.ec2.internal\" not found" Apr 16 13:11:26.042065 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.041988 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 16 13:11:26.042065 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.042021 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:11:26.042065 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.042041 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 13:11:26.042065 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.042048 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:11:26.042286 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.042081 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:11:26.044354 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.044333 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:26.074336 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.074290 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:26.075453 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.075436 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:26.075578 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.075470 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:26.075578 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.075485 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:26.075578 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.075516 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-166.ec2.internal" Apr 16 13:11:26.084847 ip-10-0-142-166 kubenswrapper[2575]: I0416 
13:11:26.084817 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-166.ec2.internal" Apr 16 13:11:26.084847 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.084841 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-166.ec2.internal\": node \"ip-10-0-142-166.ec2.internal\" not found" Apr 16 13:11:26.102936 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.102904 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 16 13:11:26.143095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.143063 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal"] Apr 16 13:11:26.143210 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.143177 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:26.144878 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.144858 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:26.144995 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.144887 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:26.144995 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.144899 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:26.146667 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.146650 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:26.146820 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.146805 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 16 13:11:26.146875 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.146838 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:26.147378 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.147364 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:26.147456 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.147390 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:26.147456 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.147403 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:26.147456 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.147369 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:26.147594 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.147462 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:26.147594 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.147479 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:26.149727 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.149711 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" Apr 16 13:11:26.149805 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.149740 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:26.150485 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.150465 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:26.150560 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.150492 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:26.150560 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.150502 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:26.170849 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.170829 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-166.ec2.internal\" not found" node="ip-10-0-142-166.ec2.internal" Apr 16 13:11:26.175484 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.175469 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-166.ec2.internal\" not found" node="ip-10-0-142-166.ec2.internal" Apr 16 13:11:26.203629 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.203607 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 16 13:11:26.304255 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.304175 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 16 13:11:26.304255 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.304236 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2353abf503b380cff8d320261ba224aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"2353abf503b380cff8d320261ba224aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 16 13:11:26.304377 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.304266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2353abf503b380cff8d320261ba224aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"2353abf503b380cff8d320261ba224aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 16 13:11:26.304377 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.304283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b77342772725edf19265f6dcbdde9121-config\") pod \"kube-apiserver-proxy-ip-10-0-142-166.ec2.internal\" (UID: \"b77342772725edf19265f6dcbdde9121\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" Apr 16 13:11:26.404837 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.404808 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 16 13:11:26.404908 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.404867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2353abf503b380cff8d320261ba224aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"2353abf503b380cff8d320261ba224aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 16 
13:11:26.404908 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.404897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2353abf503b380cff8d320261ba224aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"2353abf503b380cff8d320261ba224aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal"
Apr 16 13:11:26.404968 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.404912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b77342772725edf19265f6dcbdde9121-config\") pod \"kube-apiserver-proxy-ip-10-0-142-166.ec2.internal\" (UID: \"b77342772725edf19265f6dcbdde9121\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal"
Apr 16 13:11:26.404968 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.404951 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2353abf503b380cff8d320261ba224aa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"2353abf503b380cff8d320261ba224aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal"
Apr 16 13:11:26.404968 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.404951 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2353abf503b380cff8d320261ba224aa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"2353abf503b380cff8d320261ba224aa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal"
Apr 16 13:11:26.405056 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.404966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b77342772725edf19265f6dcbdde9121-config\") pod \"kube-apiserver-proxy-ip-10-0-142-166.ec2.internal\" (UID: \"b77342772725edf19265f6dcbdde9121\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal"
Apr 16 13:11:26.475068 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.475025 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal"
Apr 16 13:11:26.478218 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.477971 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal"
Apr 16 13:11:26.504922 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.504890 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 16 13:11:26.605482 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.605397 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 16 13:11:26.705943 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.705900 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 16 13:11:26.734451 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.734426 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:11:26.804901 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.804867 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:11:26.805507 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.805014 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:11:26.805507 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.805053 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:11:26.807045 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.807028 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 16 13:11:26.902167 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.902092 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:11:26.908043 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:26.908020 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 16 13:11:26.909422 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.909393 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:06:25 +0000 UTC" deadline="2028-01-18 05:33:07.454008568 +0000 UTC"
Apr 16 13:11:26.909472 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.909424 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15400h21m40.544588242s"
Apr 16 13:11:26.921349 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.921328 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:11:26.941613 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.941590 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-27ds2"
Apr 16 13:11:26.950715 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.950697 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-27ds2"
Apr 16 13:11:26.970470 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:26.970439 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2353abf503b380cff8d320261ba224aa.slice/crio-2b64ffa49142e21de5ace5d647667d2dae47f9fb221b185ac739618e4911dc3f WatchSource:0}: Error finding container 2b64ffa49142e21de5ace5d647667d2dae47f9fb221b185ac739618e4911dc3f: Status 404 returned error can't find the container with id 2b64ffa49142e21de5ace5d647667d2dae47f9fb221b185ac739618e4911dc3f
Apr 16 13:11:26.970699 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:26.970678 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb77342772725edf19265f6dcbdde9121.slice/crio-aba4fd40a4d617cff6d8666a54bf9d78f9a02515143298431b0e7ef325e78a19 WatchSource:0}: Error finding container aba4fd40a4d617cff6d8666a54bf9d78f9a02515143298431b0e7ef325e78a19: Status 404 returned error can't find the container with id aba4fd40a4d617cff6d8666a54bf9d78f9a02515143298431b0e7ef325e78a19
Apr 16 13:11:26.974207 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:26.974191 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:11:27.009054 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:27.009009 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 16 13:11:27.045317 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.045262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" event={"ID":"2353abf503b380cff8d320261ba224aa","Type":"ContainerStarted","Data":"2b64ffa49142e21de5ace5d647667d2dae47f9fb221b185ac739618e4911dc3f"}
Apr 16 13:11:27.046184 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.046164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" event={"ID":"b77342772725edf19265f6dcbdde9121","Type":"ContainerStarted","Data":"aba4fd40a4d617cff6d8666a54bf9d78f9a02515143298431b0e7ef325e78a19"}
Apr 16 13:11:27.109616 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:27.109583 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 16 13:11:27.210033 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:27.209957 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 16 13:11:27.238403 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.238365 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:11:27.302113 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.302072 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal"
Apr 16 13:11:27.311713 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.311686 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:11:27.312788 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.312768 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal"
Apr 16 13:11:27.323004 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.322988 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:11:27.432642 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.432613 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:11:27.877138 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.877094 2575 apiserver.go:52] "Watching apiserver"
Apr 16 13:11:27.882945 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.882922 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 13:11:27.884922 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.884889 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-h9klv","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal","openshift-multus/multus-wfgv8","openshift-network-operator/iptables-alerter-8cszh","openshift-ovn-kubernetes/ovnkube-node-jzvgp","kube-system/konnectivity-agent-glspk","kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal","openshift-cluster-node-tuning-operator/tuned-glppk","openshift-dns/node-resolver-g4rk4","openshift-image-registry/node-ca-psf9p","openshift-multus/multus-additional-cni-plugins-66xwt","openshift-multus/network-metrics-daemon-9b59n"]
Apr 16 13:11:27.888733 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.888711 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-glspk"
Apr 16 13:11:27.890922 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.890895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49"
Apr 16 13:11:27.891473 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.891447 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9d5jt\""
Apr 16 13:11:27.891473 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.891465 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 13:11:27.891999 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.891811 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 13:11:27.892793 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.892690 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 13:11:27.893045 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.893027 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 13:11:27.893796 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.893775 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 13:11:27.893886 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.893786 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p8gp2\""
Apr 16 13:11:27.895882 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.895455 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.897658 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.897640 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 13:11:27.897788 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.897772 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 13:11:27.897867 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.897826 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8cszh"
Apr 16 13:11:27.898393 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.898375 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 13:11:27.898490 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.898414 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d598s\""
Apr 16 13:11:27.898570 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.898538 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 13:11:27.900436 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.900378 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 13:11:27.900529 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.900492 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:11:27.900739 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.900722 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 13:11:27.900943 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.900927 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-72fwf\""
Apr 16 13:11:27.902454 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.902393 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.902548 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.902518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:27.902716 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:27.902597 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:27.905149 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.904787 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.905149 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.905056 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 13:11:27.905290 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.905228 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 13:11:27.911143 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.908667 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 13:11:27.911143 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.908958 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:11:27.911143 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.909871 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 13:11:27.911143 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.910223 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 13:11:27.911143 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.910947 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 13:11:27.911143 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.910957 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 13:11:27.911513 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.911162 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mmsmj\""
Apr 16 13:11:27.911513 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.911446 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nfhhc\""
Apr 16 13:11:27.913242 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.913217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-cni-netd\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.913466 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.913449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-lib-modules\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.913588 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.913574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-etc-kubernetes\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.913705 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.913684 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g4rk4"
Apr 16 13:11:27.913814 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.913707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/21bff1bd-c48b-4833-b3b3-ce5f1230db72-agent-certs\") pod \"konnectivity-agent-glspk\" (UID: \"21bff1bd-c48b-4833-b3b3-ce5f1230db72\") " pod="kube-system/konnectivity-agent-glspk"
Apr 16 13:11:27.913814 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.913743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-hostroot\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.913814 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.913773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-sysconfig\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.913814 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.913805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-kubernetes\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.914020 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.913846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c8f2f3e3-6cf2-490d-bb89-add0d52d2809-host-slash\") pod \"iptables-alerter-8cszh\" (UID: \"c8f2f3e3-6cf2-490d-bb89-add0d52d2809\") " pod="openshift-network-operator/iptables-alerter-8cszh"
Apr 16 13:11:27.914076 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25d5ba90-b543-425c-9992-d5d1d1a63331-ovnkube-config\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.914251 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-os-release\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.914426 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-run-netns\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.914506 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-etc-openvswitch\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.914506 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-log-socket\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.914602 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25d5ba90-b543-425c-9992-d5d1d1a63331-env-overrides\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.914602 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25d5ba90-b543-425c-9992-d5d1d1a63331-ovnkube-script-lib\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.914701 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g678p\" (UniqueName: \"kubernetes.io/projected/25d5ba90-b543-425c-9992-d5d1d1a63331-kube-api-access-g678p\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.914701 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8wk2\" (UniqueName: \"kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2\") pod \"network-check-target-h9klv\" (UID: \"75e624b2-8f5a-4782-b50e-326781f4d00a\") " pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:27.914701 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-conf-dir\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.914846 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c8f2f3e3-6cf2-490d-bb89-add0d52d2809-iptables-alerter-script\") pod \"iptables-alerter-8cszh\" (UID: \"c8f2f3e3-6cf2-490d-bb89-add0d52d2809\") " pod="openshift-network-operator/iptables-alerter-8cszh"
Apr 16 13:11:27.914846 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-sysctl-conf\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.914846 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914839 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-kubelet\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.915000 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-run\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.915000 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914931 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-device-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49"
Apr 16 13:11:27.915000 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-etc-selinux\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49"
Apr 16 13:11:27.915000 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.914984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-run-k8s-cni-cncf-io\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.915206 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-run-multus-certs\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.915206 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915100 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zxxw\" (UniqueName: \"kubernetes.io/projected/c8f2f3e3-6cf2-490d-bb89-add0d52d2809-kube-api-access-7zxxw\") pod \"iptables-alerter-8cszh\" (UID: \"c8f2f3e3-6cf2-490d-bb89-add0d52d2809\") " pod="openshift-network-operator/iptables-alerter-8cszh"
Apr 16 13:11:27.915206 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-run-ovn-kubernetes\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.915436 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915293 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-systemd\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.915436 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-registration-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49"
Apr 16 13:11:27.915436 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915364 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-modprobe-d\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.915436 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-sys-fs\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49"
Apr 16 13:11:27.915687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-cni-dir\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.915687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-run-ovn\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.915687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f10722c-b916-4e71-b492-f2605d67db12-etc-tuned\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.915687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f10722c-b916-4e71-b492-f2605d67db12-tmp\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.915687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-socket-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49"
Apr 16 13:11:27.915687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915620 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pn8p4\""
Apr 16 13:11:27.915687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-cnibin\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.915687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915688 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 13:11:27.916181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-socket-dir-parent\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.916181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915704 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 13:11:27.916181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915733 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-daemon-config\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.916181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-slash\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:27.916181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-sys\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.916181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915815 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-var-lib-kubelet\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk"
Apr 16 13:11:27.916181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915913 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-var-lib-kubelet\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8"
Apr 16 13:11:27.916181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.915954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5svk6\" (UniqueName: \"kubernetes.io/projected/c41aae66-1614-4b68-99e9-dae826ba8bff-kube-api-access-5svk6\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:27.916181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-host\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:27.916181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916054 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-system-cni-dir\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:27.916676 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c41aae66-1614-4b68-99e9-dae826ba8bff-cni-binary-copy\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:27.916779 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-systemd-units\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:27.916779 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916712 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-run-netns\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:27.916779 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-run-systemd\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:27.916779 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-node-log\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:27.916979 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d5ba90-b543-425c-9992-d5d1d1a63331-ovn-node-metrics-cert\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:27.916979 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jv8f\" (UniqueName: 
\"kubernetes.io/projected/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-kube-api-access-9jv8f\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:27.916979 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-var-lib-openvswitch\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:27.916979 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916872 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-cni-bin\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:27.916979 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:27.916979 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916922 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxn7d\" (UniqueName: \"kubernetes.io/projected/4f10722c-b916-4e71-b492-f2605d67db12-kube-api-access-vxn7d\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " 
pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:27.916979 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-sysctl-d\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:27.916979 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.916972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/21bff1bd-c48b-4833-b3b3-ce5f1230db72-konnectivity-ca\") pod \"konnectivity-agent-glspk\" (UID: \"21bff1bd-c48b-4833-b3b3-ce5f1230db72\") " pod="kube-system/konnectivity-agent-glspk" Apr 16 13:11:27.917442 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.917001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:27.917442 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.917025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-var-lib-cni-bin\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:27.917442 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.917048 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-var-lib-cni-multus\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:27.917442 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.917074 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-run-openvswitch\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:27.917825 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.917799 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:27.919720 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.919698 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:11:27.920170 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.920040 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xq9cv\"" Apr 16 13:11:27.920275 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.920254 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:11:27.921146 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.921111 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:11:27.922552 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.922222 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:27.924595 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.924576 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:11:27.925192 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.924817 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:11:27.925192 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.925006 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f5965\"" Apr 16 13:11:27.925798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.925511 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:27.925798 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:27.925577 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a" Apr 16 13:11:27.951860 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.951703 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:06:26 +0000 UTC" deadline="2028-01-20 07:04:09.24571663 +0000 UTC" Apr 16 13:11:27.951860 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:27.951741 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15449h52m41.293987368s" Apr 16 13:11:28.003799 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.003718 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:11:28.017414 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-sysconfig\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.017641 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-kubernetes\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.017641 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017456 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5e2c6e75-298e-4014-bf52-0dc9f276e559-tmp-dir\") pod \"node-resolver-g4rk4\" (UID: \"5e2c6e75-298e-4014-bf52-0dc9f276e559\") " pod="openshift-dns/node-resolver-g4rk4" 
Apr 16 13:11:28.017641 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-os-release\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.017641 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-kubernetes\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.017641 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017530 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-sysconfig\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.017641 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c8f2f3e3-6cf2-490d-bb89-add0d52d2809-host-slash\") pod \"iptables-alerter-8cszh\" (UID: \"c8f2f3e3-6cf2-490d-bb89-add0d52d2809\") " pod="openshift-network-operator/iptables-alerter-8cszh" Apr 16 13:11:28.017641 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c8f2f3e3-6cf2-490d-bb89-add0d52d2809-host-slash\") pod \"iptables-alerter-8cszh\" (UID: \"c8f2f3e3-6cf2-490d-bb89-add0d52d2809\") " 
pod="openshift-network-operator/iptables-alerter-8cszh" Apr 16 13:11:28.017641 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017615 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25d5ba90-b543-425c-9992-d5d1d1a63331-ovnkube-config\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.018016 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-os-release\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.018016 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-run-netns\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.018016 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017761 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6xf5\" (UniqueName: \"kubernetes.io/projected/fd1864d2-5d1c-41a0-84d9-dd4835e795d5-kube-api-access-v6xf5\") pod \"node-ca-psf9p\" (UID: \"fd1864d2-5d1c-41a0-84d9-dd4835e795d5\") " pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:28.018016 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017800 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-run-netns\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " 
pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.018016 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-os-release\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.018016 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-etc-openvswitch\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.018016 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.017976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-log-socket\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.018016 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25d5ba90-b543-425c-9992-d5d1d1a63331-env-overrides\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.018016 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-etc-openvswitch\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.018453 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-log-socket\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.018453 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25d5ba90-b543-425c-9992-d5d1d1a63331-ovnkube-script-lib\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.018453 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g678p\" (UniqueName: \"kubernetes.io/projected/25d5ba90-b543-425c-9992-d5d1d1a63331-kube-api-access-g678p\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.018453 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wk2\" (UniqueName: \"kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2\") pod \"network-check-target-h9klv\" (UID: \"75e624b2-8f5a-4782-b50e-326781f4d00a\") " pod="openshift-network-diagnostics/network-check-target-h9klv" Apr 16 13:11:28.018737 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-conf-dir\") pod 
\"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.018819 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8156b1b5-dde0-4575-9f54-ea6d2acf9495-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.018819 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c8f2f3e3-6cf2-490d-bb89-add0d52d2809-iptables-alerter-script\") pod \"iptables-alerter-8cszh\" (UID: \"c8f2f3e3-6cf2-490d-bb89-add0d52d2809\") " pod="openshift-network-operator/iptables-alerter-8cszh" Apr 16 13:11:28.018926 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-sysctl-conf\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.018926 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-kubelet\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.018926 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-run\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.019051 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-device-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.019051 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.018982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-etc-selinux\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.019051 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-kubelet\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.019051 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-run-k8s-cni-cncf-io\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.019256 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-run-multus-certs\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.019256 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019096 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-sysctl-conf\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.019256 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn7xm\" (UniqueName: \"kubernetes.io/projected/5e2c6e75-298e-4014-bf52-0dc9f276e559-kube-api-access-mn7xm\") pod \"node-resolver-g4rk4\" (UID: \"5e2c6e75-298e-4014-bf52-0dc9f276e559\") " pod="openshift-dns/node-resolver-g4rk4" Apr 16 13:11:28.019256 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019093 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-run-k8s-cni-cncf-io\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.019256 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-run-multus-certs\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.019256 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019223 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-tuning-conf-dir\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.019549 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019283 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-run\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.019549 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zxxw\" (UniqueName: \"kubernetes.io/projected/c8f2f3e3-6cf2-490d-bb89-add0d52d2809-kube-api-access-7zxxw\") pod \"iptables-alerter-8cszh\" (UID: \"c8f2f3e3-6cf2-490d-bb89-add0d52d2809\") " pod="openshift-network-operator/iptables-alerter-8cszh" Apr 16 13:11:28.019549 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019364 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-device-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.019549 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c8f2f3e3-6cf2-490d-bb89-add0d52d2809-iptables-alerter-script\") pod \"iptables-alerter-8cszh\" (UID: \"c8f2f3e3-6cf2-490d-bb89-add0d52d2809\") " pod="openshift-network-operator/iptables-alerter-8cszh" Apr 16 13:11:28.019549 
ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25d5ba90-b543-425c-9992-d5d1d1a63331-ovnkube-script-lib\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.019549 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-run-ovn-kubernetes\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.019549 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019470 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-etc-selinux\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.019549 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019521 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25d5ba90-b543-425c-9992-d5d1d1a63331-env-overrides\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.019549 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-systemd\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " 
pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-registration-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-conf-dir\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-run-ovn-kubernetes\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-systemd\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019753 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffrt8\" (UniqueName: \"kubernetes.io/projected/8156b1b5-dde0-4575-9f54-ea6d2acf9495-kube-api-access-ffrt8\") pod 
\"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019802 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-registration-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-modprobe-d\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-sys-fs\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-cni-dir\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-modprobe-d\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019925 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-sys-fs\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.019948 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5e2c6e75-298e-4014-bf52-0dc9f276e559-hosts-file\") pod \"node-resolver-g4rk4\" (UID: \"5e2c6e75-298e-4014-bf52-0dc9f276e559\") " pod="openshift-dns/node-resolver-g4rk4" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fd1864d2-5d1c-41a0-84d9-dd4835e795d5-serviceca\") pod \"node-ca-psf9p\" (UID: \"fd1864d2-5d1c-41a0-84d9-dd4835e795d5\") " pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.019984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-cni-dir\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020018 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-run-ovn\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f10722c-b916-4e71-b492-f2605d67db12-etc-tuned\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f10722c-b916-4e71-b492-f2605d67db12-tmp\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25d5ba90-b543-425c-9992-d5d1d1a63331-ovnkube-config\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-socket-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-run-ovn\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-cnibin\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-socket-dir-parent\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-daemon-config\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020215 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-system-cni-dir\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020242 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-slash\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-socket-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-sys\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-var-lib-kubelet\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-var-lib-kubelet\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.020519 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-cnibin\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.021313 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5svk6\" (UniqueName: \"kubernetes.io/projected/c41aae66-1614-4b68-99e9-dae826ba8bff-kube-api-access-5svk6\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.021313 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-host\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.021313 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-socket-dir-parent\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.021313 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-system-cni-dir\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.021313 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-var-lib-kubelet\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.021313 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c41aae66-1614-4b68-99e9-dae826ba8bff-cni-binary-copy\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.021313 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020407 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:11:28.021313 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-systemd-units\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.021313 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-sys\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.021313 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021312 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c41aae66-1614-4b68-99e9-dae826ba8bff-multus-daemon-config\") pod \"multus-wfgv8\" (UID: 
\"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.021737 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-host\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.021737 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-run-netns\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.021737 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-run-systemd\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.021737 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-node-log\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.021737 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d5ba90-b543-425c-9992-d5d1d1a63331-ovn-node-metrics-cert\") pod \"ovnkube-node-jzvgp\" (UID: 
\"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.021955 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jv8f\" (UniqueName: \"kubernetes.io/projected/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-kube-api-access-9jv8f\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.021955 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-var-lib-openvswitch\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.021955 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-cni-bin\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.021955 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.021955 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxn7d\" 
(UniqueName: \"kubernetes.io/projected/4f10722c-b916-4e71-b492-f2605d67db12-kube-api-access-vxn7d\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.021955 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021922 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:28.022266 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jln8m\" (UniqueName: \"kubernetes.io/projected/0865faea-916d-435f-88f5-d2b559f1d79a-kube-api-access-jln8m\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:28.022266 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.021991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-sysctl-d\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.022266 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/21bff1bd-c48b-4833-b3b3-ce5f1230db72-konnectivity-ca\") pod \"konnectivity-agent-glspk\" (UID: \"21bff1bd-c48b-4833-b3b3-ce5f1230db72\") " pod="kube-system/konnectivity-agent-glspk" Apr 16 13:11:28.022266 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022048 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-run-netns\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.022266 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.022266 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022145 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.022266 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-var-lib-cni-bin\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.022266 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-run-systemd\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.022612 ip-10-0-142-166 kubenswrapper[2575]: I0416 
13:11:28.022282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-systemd-units\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.022612 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-var-lib-cni-bin\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.022612 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-node-log\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.022612 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.020430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-slash\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.022612 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-var-lib-cni-multus\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.022612 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022568 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-system-cni-dir\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.024915 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.023769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-var-lib-kubelet\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.024915 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.024155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-cni-bin\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.024915 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.024225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-var-lib-openvswitch\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.024915 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.024307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.024915 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.024379 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c41aae66-1614-4b68-99e9-dae826ba8bff-cni-binary-copy\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.024915 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.024478 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-etc-sysctl-d\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.025288 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.024935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/21bff1bd-c48b-4833-b3b3-ce5f1230db72-konnectivity-ca\") pod \"konnectivity-agent-glspk\" (UID: \"21bff1bd-c48b-4833-b3b3-ce5f1230db72\") " pod="kube-system/konnectivity-agent-glspk" Apr 16 13:11:28.025288 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.022212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-host-var-lib-cni-multus\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.025288 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-run-openvswitch\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.025288 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025252 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-cni-netd\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.025288 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-lib-modules\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.025526 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-etc-kubernetes\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.025526 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025339 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8156b1b5-dde0-4575-9f54-ea6d2acf9495-cni-binary-copy\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.025526 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8156b1b5-dde0-4575-9f54-ea6d2acf9495-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" 
Apr 16 13:11:28.025526 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/21bff1bd-c48b-4833-b3b3-ce5f1230db72-agent-certs\") pod \"konnectivity-agent-glspk\" (UID: \"21bff1bd-c48b-4833-b3b3-ce5f1230db72\") " pod="kube-system/konnectivity-agent-glspk" Apr 16 13:11:28.025526 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-hostroot\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.025526 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd1864d2-5d1c-41a0-84d9-dd4835e795d5-host\") pod \"node-ca-psf9p\" (UID: \"fd1864d2-5d1c-41a0-84d9-dd4835e795d5\") " pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:28.025526 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025488 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-cnibin\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.025832 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-run-openvswitch\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 
13:11:28.025832 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25d5ba90-b543-425c-9992-d5d1d1a63331-host-cni-netd\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.025832 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f10722c-b916-4e71-b492-f2605d67db12-lib-modules\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.025832 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.025779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-etc-kubernetes\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.027512 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.026266 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:28.027512 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.026295 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:28.027512 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.026328 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j8wk2 for pod openshift-network-diagnostics/network-check-target-h9klv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:28.027512 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.026473 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2 podName:75e624b2-8f5a-4782-b50e-326781f4d00a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:28.52643684 +0000 UTC m=+3.115813924 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-j8wk2" (UniqueName: "kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2") pod "network-check-target-h9klv" (UID: "75e624b2-8f5a-4782-b50e-326781f4d00a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:28.027512 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.027436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c41aae66-1614-4b68-99e9-dae826ba8bff-hostroot\") pod \"multus-wfgv8\" (UID: \"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.034260 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.033424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxn7d\" (UniqueName: \"kubernetes.io/projected/4f10722c-b916-4e71-b492-f2605d67db12-kube-api-access-vxn7d\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.034260 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.033547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5svk6\" (UniqueName: \"kubernetes.io/projected/c41aae66-1614-4b68-99e9-dae826ba8bff-kube-api-access-5svk6\") pod \"multus-wfgv8\" (UID: 
\"c41aae66-1614-4b68-99e9-dae826ba8bff\") " pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.035485 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.034520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/21bff1bd-c48b-4833-b3b3-ce5f1230db72-agent-certs\") pod \"konnectivity-agent-glspk\" (UID: \"21bff1bd-c48b-4833-b3b3-ce5f1230db72\") " pod="kube-system/konnectivity-agent-glspk" Apr 16 13:11:28.035485 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.035147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zxxw\" (UniqueName: \"kubernetes.io/projected/c8f2f3e3-6cf2-490d-bb89-add0d52d2809-kube-api-access-7zxxw\") pod \"iptables-alerter-8cszh\" (UID: \"c8f2f3e3-6cf2-490d-bb89-add0d52d2809\") " pod="openshift-network-operator/iptables-alerter-8cszh" Apr 16 13:11:28.035485 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.035346 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f10722c-b916-4e71-b492-f2605d67db12-tmp\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.035682 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.035537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f10722c-b916-4e71-b492-f2605d67db12-etc-tuned\") pod \"tuned-glppk\" (UID: \"4f10722c-b916-4e71-b492-f2605d67db12\") " pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.036000 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.035937 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g678p\" (UniqueName: \"kubernetes.io/projected/25d5ba90-b543-425c-9992-d5d1d1a63331-kube-api-access-g678p\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.038077 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.038048 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jv8f\" (UniqueName: \"kubernetes.io/projected/8eb1ce78-01ab-41e7-bc32-6dbe30df94e4-kube-api-access-9jv8f\") pod \"aws-ebs-csi-driver-node-69l49\" (UID: \"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.038763 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.038739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d5ba90-b543-425c-9992-d5d1d1a63331-ovn-node-metrics-cert\") pod \"ovnkube-node-jzvgp\" (UID: \"25d5ba90-b543-425c-9992-d5d1d1a63331\") " pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.125827 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.125796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-system-cni-dir\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126008 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.125850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:28.126008 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.125876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jln8m\" (UniqueName: 
\"kubernetes.io/projected/0865faea-916d-435f-88f5-d2b559f1d79a-kube-api-access-jln8m\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:28.126008 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.125910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8156b1b5-dde0-4575-9f54-ea6d2acf9495-cni-binary-copy\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126008 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.125953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8156b1b5-dde0-4575-9f54-ea6d2acf9495-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126008 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.125980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd1864d2-5d1c-41a0-84d9-dd4835e795d5-host\") pod \"node-ca-psf9p\" (UID: \"fd1864d2-5d1c-41a0-84d9-dd4835e795d5\") " pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:28.126008 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-cnibin\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126209 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126032 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5e2c6e75-298e-4014-bf52-0dc9f276e559-tmp-dir\") pod \"node-resolver-g4rk4\" (UID: \"5e2c6e75-298e-4014-bf52-0dc9f276e559\") " pod="openshift-dns/node-resolver-g4rk4" Apr 16 13:11:28.126209 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-os-release\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126209 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6xf5\" (UniqueName: \"kubernetes.io/projected/fd1864d2-5d1c-41a0-84d9-dd4835e795d5-kube-api-access-v6xf5\") pod \"node-ca-psf9p\" (UID: \"fd1864d2-5d1c-41a0-84d9-dd4835e795d5\") " pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:28.126209 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8156b1b5-dde0-4575-9f54-ea6d2acf9495-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126209 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn7xm\" (UniqueName: \"kubernetes.io/projected/5e2c6e75-298e-4014-bf52-0dc9f276e559-kube-api-access-mn7xm\") pod \"node-resolver-g4rk4\" (UID: \"5e2c6e75-298e-4014-bf52-0dc9f276e559\") " pod="openshift-dns/node-resolver-g4rk4" Apr 16 13:11:28.126209 
ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-tuning-conf-dir\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126464 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffrt8\" (UniqueName: \"kubernetes.io/projected/8156b1b5-dde0-4575-9f54-ea6d2acf9495-kube-api-access-ffrt8\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126464 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5e2c6e75-298e-4014-bf52-0dc9f276e559-hosts-file\") pod \"node-resolver-g4rk4\" (UID: \"5e2c6e75-298e-4014-bf52-0dc9f276e559\") " pod="openshift-dns/node-resolver-g4rk4" Apr 16 13:11:28.126464 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fd1864d2-5d1c-41a0-84d9-dd4835e795d5-serviceca\") pod \"node-ca-psf9p\" (UID: \"fd1864d2-5d1c-41a0-84d9-dd4835e795d5\") " pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:28.126464 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-system-cni-dir\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " 
pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126464 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.126423 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:28.126701 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-cnibin\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126701 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.126534 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs podName:0865faea-916d-435f-88f5-d2b559f1d79a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:28.626514507 +0000 UTC m=+3.215891590 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs") pod "network-metrics-daemon-9b59n" (UID: "0865faea-916d-435f-88f5-d2b559f1d79a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:28.126701 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8156b1b5-dde0-4575-9f54-ea6d2acf9495-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126701 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5e2c6e75-298e-4014-bf52-0dc9f276e559-tmp-dir\") pod \"node-resolver-g4rk4\" (UID: \"5e2c6e75-298e-4014-bf52-0dc9f276e559\") " pod="openshift-dns/node-resolver-g4rk4" Apr 16 13:11:28.126701 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-os-release\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.126962 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fd1864d2-5d1c-41a0-84d9-dd4835e795d5-serviceca\") pod \"node-ca-psf9p\" (UID: \"fd1864d2-5d1c-41a0-84d9-dd4835e795d5\") " pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:28.126962 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126841 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd1864d2-5d1c-41a0-84d9-dd4835e795d5-host\") pod \"node-ca-psf9p\" (UID: \"fd1864d2-5d1c-41a0-84d9-dd4835e795d5\") " pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:28.126962 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.126875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5e2c6e75-298e-4014-bf52-0dc9f276e559-hosts-file\") pod \"node-resolver-g4rk4\" (UID: \"5e2c6e75-298e-4014-bf52-0dc9f276e559\") " pod="openshift-dns/node-resolver-g4rk4" Apr 16 13:11:28.127101 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.127040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8156b1b5-dde0-4575-9f54-ea6d2acf9495-cni-binary-copy\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.127193 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.127143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8156b1b5-dde0-4575-9f54-ea6d2acf9495-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.127247 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.127210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8156b1b5-dde0-4575-9f54-ea6d2acf9495-tuning-conf-dir\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.135624 ip-10-0-142-166 kubenswrapper[2575]: 
I0416 13:11:28.135589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn7xm\" (UniqueName: \"kubernetes.io/projected/5e2c6e75-298e-4014-bf52-0dc9f276e559-kube-api-access-mn7xm\") pod \"node-resolver-g4rk4\" (UID: \"5e2c6e75-298e-4014-bf52-0dc9f276e559\") " pod="openshift-dns/node-resolver-g4rk4" Apr 16 13:11:28.135787 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.135106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6xf5\" (UniqueName: \"kubernetes.io/projected/fd1864d2-5d1c-41a0-84d9-dd4835e795d5-kube-api-access-v6xf5\") pod \"node-ca-psf9p\" (UID: \"fd1864d2-5d1c-41a0-84d9-dd4835e795d5\") " pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:28.135787 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.135653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jln8m\" (UniqueName: \"kubernetes.io/projected/0865faea-916d-435f-88f5-d2b559f1d79a-kube-api-access-jln8m\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:28.136317 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.136289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffrt8\" (UniqueName: \"kubernetes.io/projected/8156b1b5-dde0-4575-9f54-ea6d2acf9495-kube-api-access-ffrt8\") pod \"multus-additional-cni-plugins-66xwt\" (UID: \"8156b1b5-dde0-4575-9f54-ea6d2acf9495\") " pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.202256 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.202215 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-glspk" Apr 16 13:11:28.216291 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.216262 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" Apr 16 13:11:28.227075 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.227040 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wfgv8" Apr 16 13:11:28.235388 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.235351 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8cszh" Apr 16 13:11:28.244091 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.244071 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" Apr 16 13:11:28.252770 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.252750 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-glppk" Apr 16 13:11:28.260342 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.260321 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g4rk4" Apr 16 13:11:28.269868 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.269851 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-psf9p" Apr 16 13:11:28.276425 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.276409 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-66xwt" Apr 16 13:11:28.339383 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.339355 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:28.530656 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.530622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wk2\" (UniqueName: \"kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2\") pod \"network-check-target-h9klv\" (UID: \"75e624b2-8f5a-4782-b50e-326781f4d00a\") " pod="openshift-network-diagnostics/network-check-target-h9klv" Apr 16 13:11:28.530844 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.530755 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:28.530844 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.530776 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:28.530844 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.530786 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j8wk2 for pod openshift-network-diagnostics/network-check-target-h9klv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:28.531035 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.530866 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2 podName:75e624b2-8f5a-4782-b50e-326781f4d00a nodeName:}" failed. 
No retries permitted until 2026-04-16 13:11:29.530841408 +0000 UTC m=+4.120218480 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-j8wk2" (UniqueName: "kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2") pod "network-check-target-h9klv" (UID: "75e624b2-8f5a-4782-b50e-326781f4d00a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:28.631660 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.631626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:28.631807 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.631782 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:28.631858 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:28.631847 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs podName:0865faea-916d-435f-88f5-d2b559f1d79a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:29.631831457 +0000 UTC m=+4.221208525 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs") pod "network-metrics-daemon-9b59n" (UID: "0865faea-916d-435f-88f5-d2b559f1d79a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:28.653533 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:28.653493 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e2c6e75_298e_4014_bf52_0dc9f276e559.slice/crio-996989f96f75b50e98b2a05de32ab836015468220c0e3ecfcdcc9c4d018c471e WatchSource:0}: Error finding container 996989f96f75b50e98b2a05de32ab836015468220c0e3ecfcdcc9c4d018c471e: Status 404 returned error can't find the container with id 996989f96f75b50e98b2a05de32ab836015468220c0e3ecfcdcc9c4d018c471e Apr 16 13:11:28.655326 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:28.655296 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25d5ba90_b543_425c_9992_d5d1d1a63331.slice/crio-eff09046edc8be1d055fe89e0d2f652caa1c295d5ef012ba23bcfd62684727a4 WatchSource:0}: Error finding container eff09046edc8be1d055fe89e0d2f652caa1c295d5ef012ba23bcfd62684727a4: Status 404 returned error can't find the container with id eff09046edc8be1d055fe89e0d2f652caa1c295d5ef012ba23bcfd62684727a4 Apr 16 13:11:28.658106 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:28.658079 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21bff1bd_c48b_4833_b3b3_ce5f1230db72.slice/crio-9eca21493cf1a94d621c1a5b27209998d4f70b069ae4970a84f915c3f28da7f4 WatchSource:0}: Error finding container 9eca21493cf1a94d621c1a5b27209998d4f70b069ae4970a84f915c3f28da7f4: Status 404 returned error can't find the container with id 9eca21493cf1a94d621c1a5b27209998d4f70b069ae4970a84f915c3f28da7f4 Apr 16 13:11:28.658957 
ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:28.658889 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eb1ce78_01ab_41e7_bc32_6dbe30df94e4.slice/crio-d71c779a3a95a11c26970d5c55be65180654f9d0df1454bdb002a4d9c771812e WatchSource:0}: Error finding container d71c779a3a95a11c26970d5c55be65180654f9d0df1454bdb002a4d9c771812e: Status 404 returned error can't find the container with id d71c779a3a95a11c26970d5c55be65180654f9d0df1454bdb002a4d9c771812e Apr 16 13:11:28.667801 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.667781 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:28.681240 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:28.681061 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd1864d2_5d1c_41a0_84d9_dd4835e795d5.slice/crio-c28592c989037073ea19a84a5a198e954c4e3006b85b49d762d8bbaeb3c3a8a6 WatchSource:0}: Error finding container c28592c989037073ea19a84a5a198e954c4e3006b85b49d762d8bbaeb3c3a8a6: Status 404 returned error can't find the container with id c28592c989037073ea19a84a5a198e954c4e3006b85b49d762d8bbaeb3c3a8a6 Apr 16 13:11:28.681515 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:28.681487 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f10722c_b916_4e71_b492_f2605d67db12.slice/crio-ce7664f36ccb6590a8c5983571d52eb436310140ed12ad99250524542060abc1 WatchSource:0}: Error finding container ce7664f36ccb6590a8c5983571d52eb436310140ed12ad99250524542060abc1: Status 404 returned error can't find the container with id ce7664f36ccb6590a8c5983571d52eb436310140ed12ad99250524542060abc1 Apr 16 13:11:28.682455 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:28.682359 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41aae66_1614_4b68_99e9_dae826ba8bff.slice/crio-8ae7460991eeb878421732384c06c84f29521bf42ae7e05da100d1bd7bbaac05 WatchSource:0}: Error finding container 8ae7460991eeb878421732384c06c84f29521bf42ae7e05da100d1bd7bbaac05: Status 404 returned error can't find the container with id 8ae7460991eeb878421732384c06c84f29521bf42ae7e05da100d1bd7bbaac05
Apr 16 13:11:28.683311 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:28.683222 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8f2f3e3_6cf2_490d_bb89_add0d52d2809.slice/crio-873071248e8d65d2141dbada7666861b521d24d1449bbf3a146ed00da2168ea0 WatchSource:0}: Error finding container 873071248e8d65d2141dbada7666861b521d24d1449bbf3a146ed00da2168ea0: Status 404 returned error can't find the container with id 873071248e8d65d2141dbada7666861b521d24d1449bbf3a146ed00da2168ea0
Apr 16 13:11:28.684431 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:28.684363 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8156b1b5_dde0_4575_9f54_ea6d2acf9495.slice/crio-95a808d350165c3736acd1bdbc79298731eaf77eab89e3e940dcd9ecda378e84 WatchSource:0}: Error finding container 95a808d350165c3736acd1bdbc79298731eaf77eab89e3e940dcd9ecda378e84: Status 404 returned error can't find the container with id 95a808d350165c3736acd1bdbc79298731eaf77eab89e3e940dcd9ecda378e84
Apr 16 13:11:28.952149 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.951888 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:06:26 +0000 UTC" deadline="2027-11-17 22:30:15.259180715 +0000 UTC"
Apr 16 13:11:28.952149 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:28.952070 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13929h18m46.307115201s"
Apr 16 13:11:29.051403 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.051350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xwt" event={"ID":"8156b1b5-dde0-4575-9f54-ea6d2acf9495","Type":"ContainerStarted","Data":"95a808d350165c3736acd1bdbc79298731eaf77eab89e3e940dcd9ecda378e84"}
Apr 16 13:11:29.052887 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.052833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8cszh" event={"ID":"c8f2f3e3-6cf2-490d-bb89-add0d52d2809","Type":"ContainerStarted","Data":"873071248e8d65d2141dbada7666861b521d24d1449bbf3a146ed00da2168ea0"}
Apr 16 13:11:29.055838 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.055811 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-glppk" event={"ID":"4f10722c-b916-4e71-b492-f2605d67db12","Type":"ContainerStarted","Data":"ce7664f36ccb6590a8c5983571d52eb436310140ed12ad99250524542060abc1"}
Apr 16 13:11:29.059844 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.059789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" event={"ID":"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4","Type":"ContainerStarted","Data":"d71c779a3a95a11c26970d5c55be65180654f9d0df1454bdb002a4d9c771812e"}
Apr 16 13:11:29.063978 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.063920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" event={"ID":"25d5ba90-b543-425c-9992-d5d1d1a63331","Type":"ContainerStarted","Data":"eff09046edc8be1d055fe89e0d2f652caa1c295d5ef012ba23bcfd62684727a4"}
Apr 16 13:11:29.069491 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.069426 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g4rk4" event={"ID":"5e2c6e75-298e-4014-bf52-0dc9f276e559","Type":"ContainerStarted","Data":"996989f96f75b50e98b2a05de32ab836015468220c0e3ecfcdcc9c4d018c471e"}
Apr 16 13:11:29.075137 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.075085 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" event={"ID":"b77342772725edf19265f6dcbdde9121","Type":"ContainerStarted","Data":"614428a694ee03f4a61c37a60cf80b453918148b074bd15fbe532a4650d290a3"}
Apr 16 13:11:29.078241 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.078213 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wfgv8" event={"ID":"c41aae66-1614-4b68-99e9-dae826ba8bff","Type":"ContainerStarted","Data":"8ae7460991eeb878421732384c06c84f29521bf42ae7e05da100d1bd7bbaac05"}
Apr 16 13:11:29.081159 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.081111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-psf9p" event={"ID":"fd1864d2-5d1c-41a0-84d9-dd4835e795d5","Type":"ContainerStarted","Data":"c28592c989037073ea19a84a5a198e954c4e3006b85b49d762d8bbaeb3c3a8a6"}
Apr 16 13:11:29.084057 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.084030 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-glspk" event={"ID":"21bff1bd-c48b-4833-b3b3-ce5f1230db72","Type":"ContainerStarted","Data":"9eca21493cf1a94d621c1a5b27209998d4f70b069ae4970a84f915c3f28da7f4"}
Apr 16 13:11:29.537445 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.537411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wk2\" (UniqueName: \"kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2\") pod \"network-check-target-h9klv\" (UID: \"75e624b2-8f5a-4782-b50e-326781f4d00a\") " pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:29.537606 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:29.537588 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:11:29.537650 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:29.537611 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:11:29.537650 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:29.537625 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j8wk2 for pod openshift-network-diagnostics/network-check-target-h9klv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:11:29.537713 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:29.537679 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2 podName:75e624b2-8f5a-4782-b50e-326781f4d00a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:31.537665921 +0000 UTC m=+6.127042988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-j8wk2" (UniqueName: "kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2") pod "network-check-target-h9klv" (UID: "75e624b2-8f5a-4782-b50e-326781f4d00a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:11:29.638177 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:29.638133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:29.638360 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:29.638292 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:11:29.638360 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:29.638355 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs podName:0865faea-916d-435f-88f5-d2b559f1d79a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:31.638337134 +0000 UTC m=+6.227714205 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs") pod "network-metrics-daemon-9b59n" (UID: "0865faea-916d-435f-88f5-d2b559f1d79a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:11:30.042803 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:30.042369 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:30.042803 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:30.042573 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:30.042803 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:30.042606 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:30.042803 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:30.042740 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:30.099298 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:30.099261 2575 generic.go:358] "Generic (PLEG): container finished" podID="2353abf503b380cff8d320261ba224aa" containerID="f8f7de0eab8680154a71e4d090ae4753fd57fa5605a10d97afe9d584dc7047fd" exitCode=0
Apr 16 13:11:30.100187 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:30.100163 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" event={"ID":"2353abf503b380cff8d320261ba224aa","Type":"ContainerDied","Data":"f8f7de0eab8680154a71e4d090ae4753fd57fa5605a10d97afe9d584dc7047fd"}
Apr 16 13:11:30.118352 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:30.118299 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" podStartSLOduration=3.118283239 podStartE2EDuration="3.118283239s" podCreationTimestamp="2026-04-16 13:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:11:29.087193578 +0000 UTC m=+3.676570667" watchObservedRunningTime="2026-04-16 13:11:30.118283239 +0000 UTC m=+4.707660328"
Apr 16 13:11:31.117067 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:31.117029 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" event={"ID":"2353abf503b380cff8d320261ba224aa","Type":"ContainerStarted","Data":"07deb42654e180d353772ac3d0ccde07b66e7a7fb833adbef8450e506d76e567"}
Apr 16 13:11:31.556897 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:31.556851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wk2\" (UniqueName: \"kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2\") pod \"network-check-target-h9klv\" (UID: \"75e624b2-8f5a-4782-b50e-326781f4d00a\") " pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:31.557078 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:31.557050 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:11:31.557078 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:31.557068 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:11:31.557078 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:31.557081 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j8wk2 for pod openshift-network-diagnostics/network-check-target-h9klv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:11:31.557253 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:31.557169 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2 podName:75e624b2-8f5a-4782-b50e-326781f4d00a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:35.557150941 +0000 UTC m=+10.146528033 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-j8wk2" (UniqueName: "kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2") pod "network-check-target-h9klv" (UID: "75e624b2-8f5a-4782-b50e-326781f4d00a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:11:31.658171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:31.657598 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:31.658171 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:31.657745 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:11:31.658171 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:31.657814 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs podName:0865faea-916d-435f-88f5-d2b559f1d79a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:35.657795645 +0000 UTC m=+10.247172715 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs") pod "network-metrics-daemon-9b59n" (UID: "0865faea-916d-435f-88f5-d2b559f1d79a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:11:32.043368 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:32.043193 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:32.043368 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:32.043320 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:32.043863 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:32.043652 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:32.043863 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:32.043769 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:34.043147 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:34.042598 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:34.043147 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:34.042757 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:34.043147 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:34.042951 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:34.043147 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:34.043071 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:35.594381 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:35.594253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wk2\" (UniqueName: \"kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2\") pod \"network-check-target-h9klv\" (UID: \"75e624b2-8f5a-4782-b50e-326781f4d00a\") " pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:35.594875 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:35.594419 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:11:35.594875 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:35.594444 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:11:35.594875 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:35.594458 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j8wk2 for pod openshift-network-diagnostics/network-check-target-h9klv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:11:35.594875 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:35.594520 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2 podName:75e624b2-8f5a-4782-b50e-326781f4d00a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:43.594501338 +0000 UTC m=+18.183878409 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-j8wk2" (UniqueName: "kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2") pod "network-check-target-h9klv" (UID: "75e624b2-8f5a-4782-b50e-326781f4d00a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:11:35.695241 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:35.695206 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:35.695431 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:35.695378 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:11:35.695494 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:35.695459 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs podName:0865faea-916d-435f-88f5-d2b559f1d79a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:43.69543553 +0000 UTC m=+18.284812598 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs") pod "network-metrics-daemon-9b59n" (UID: "0865faea-916d-435f-88f5-d2b559f1d79a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:11:35.888016 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:35.887606 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" podStartSLOduration=8.887587269 podStartE2EDuration="8.887587269s" podCreationTimestamp="2026-04-16 13:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:11:31.130220269 +0000 UTC m=+5.719597371" watchObservedRunningTime="2026-04-16 13:11:35.887587269 +0000 UTC m=+10.476964357"
Apr 16 13:11:35.888215 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:35.888043 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hqc54"]
Apr 16 13:11:35.892330 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:35.892301 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:35.892457 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:35.892376 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e"
Apr 16 13:11:35.998418 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:35.998385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/454de3ce-a596-4780-b1b7-e2fe418de97e-kubelet-config\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:35.998559 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:35.998425 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/454de3ce-a596-4780-b1b7-e2fe418de97e-dbus\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:35.998559 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:35.998502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:36.043650 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:36.043615 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:36.043821 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:36.043726 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:36.043821 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:36.043765 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:36.043928 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:36.043848 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:36.099826 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:36.099793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/454de3ce-a596-4780-b1b7-e2fe418de97e-kubelet-config\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:36.100031 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:36.099842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/454de3ce-a596-4780-b1b7-e2fe418de97e-dbus\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:36.100031 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:36.099916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:36.100182 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:36.100032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/454de3ce-a596-4780-b1b7-e2fe418de97e-kubelet-config\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:36.100182 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:36.100040 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:36.100182 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:36.100131 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret podName:454de3ce-a596-4780-b1b7-e2fe418de97e nodeName:}" failed. No retries permitted until 2026-04-16 13:11:36.600097118 +0000 UTC m=+11.189474188 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret") pod "global-pull-secret-syncer-hqc54" (UID: "454de3ce-a596-4780-b1b7-e2fe418de97e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:36.100357 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:36.100209 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/454de3ce-a596-4780-b1b7-e2fe418de97e-dbus\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:36.604411 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:36.604373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:36.604893 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:36.604547 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:36.604893 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:36.604605 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret podName:454de3ce-a596-4780-b1b7-e2fe418de97e nodeName:}" failed. No retries permitted until 2026-04-16 13:11:37.604586871 +0000 UTC m=+12.193963943 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret") pod "global-pull-secret-syncer-hqc54" (UID: "454de3ce-a596-4780-b1b7-e2fe418de97e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:37.611688 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:37.611639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:37.612200 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:37.611803 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:37.612200 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:37.611888 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret podName:454de3ce-a596-4780-b1b7-e2fe418de97e nodeName:}" failed. No retries permitted until 2026-04-16 13:11:39.611867785 +0000 UTC m=+14.201244862 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret") pod "global-pull-secret-syncer-hqc54" (UID: "454de3ce-a596-4780-b1b7-e2fe418de97e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:38.042502 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:38.042470 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:38.042682 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:38.042479 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:38.042682 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:38.042594 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:38.042682 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:38.042482 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:38.042848 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:38.042672 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e"
Apr 16 13:11:38.042848 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:38.042747 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:39.626921 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:39.626873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:39.627498 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:39.627032 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:39.627498 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:39.627114 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret podName:454de3ce-a596-4780-b1b7-e2fe418de97e nodeName:}" failed. No retries permitted until 2026-04-16 13:11:43.627093514 +0000 UTC m=+18.216470581 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret") pod "global-pull-secret-syncer-hqc54" (UID: "454de3ce-a596-4780-b1b7-e2fe418de97e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:40.043192 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:40.043154 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:40.043192 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:40.043182 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:40.043428 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:40.043163 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:40.043428 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:40.043295 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e"
Apr 16 13:11:40.043428 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:40.043375 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:40.043568 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:40.043445 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:42.043065 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:42.042975 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:42.043539 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:42.043107 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:42.043539 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:42.043111 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:42.043539 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:42.043234 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e"
Apr 16 13:11:42.043539 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:42.043270 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:42.043539 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:42.043353 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:43.656686 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:43.656649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wk2\" (UniqueName: \"kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2\") pod \"network-check-target-h9klv\" (UID: \"75e624b2-8f5a-4782-b50e-326781f4d00a\") " pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:43.657156 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:43.656697 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:43.657156 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:43.656806 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:43.657156 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:43.656828 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:11:43.657156 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:43.656855 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:11:43.657156 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:43.656865 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret podName:454de3ce-a596-4780-b1b7-e2fe418de97e nodeName:}" failed.
No retries permitted until 2026-04-16 13:11:51.656848549 +0000 UTC m=+26.246225623 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret") pod "global-pull-secret-syncer-hqc54" (UID: "454de3ce-a596-4780-b1b7-e2fe418de97e") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:11:43.657156 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:43.656868 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j8wk2 for pod openshift-network-diagnostics/network-check-target-h9klv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:43.657156 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:43.656919 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2 podName:75e624b2-8f5a-4782-b50e-326781f4d00a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:59.656904132 +0000 UTC m=+34.246281213 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-j8wk2" (UniqueName: "kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2") pod "network-check-target-h9klv" (UID: "75e624b2-8f5a-4782-b50e-326781f4d00a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:43.757620 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:43.757588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:43.757784 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:43.757760 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:43.757848 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:43.757836 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs podName:0865faea-916d-435f-88f5-d2b559f1d79a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:59.757815685 +0000 UTC m=+34.347192754 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs") pod "network-metrics-daemon-9b59n" (UID: "0865faea-916d-435f-88f5-d2b559f1d79a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:44.042393 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:44.042352 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv" Apr 16 13:11:44.042578 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:44.042362 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:44.042578 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:44.042460 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a" Apr 16 13:11:44.042578 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:44.042363 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54" Apr 16 13:11:44.042578 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:44.042563 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a" Apr 16 13:11:44.042730 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:44.042666 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e" Apr 16 13:11:46.043421 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.043138 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv" Apr 16 13:11:46.044036 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.043166 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54" Apr 16 13:11:46.044036 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:46.043506 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a" Apr 16 13:11:46.044036 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.043184 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:46.044036 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:46.043568 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e" Apr 16 13:11:46.044036 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:46.043663 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a" Apr 16 13:11:46.143223 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.143176 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wfgv8" event={"ID":"c41aae66-1614-4b68-99e9-dae826ba8bff","Type":"ContainerStarted","Data":"13b6329ca4d9a4887ac793ad2d7b8da6d2e6236687c73f773032b973f2818353"} Apr 16 13:11:46.144723 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.144689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-psf9p" event={"ID":"fd1864d2-5d1c-41a0-84d9-dd4835e795d5","Type":"ContainerStarted","Data":"7f654430b2824ce9685762a03565ca813a182271de0eb5ec4186540411227d51"} Apr 16 13:11:46.146595 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.146334 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-glspk" event={"ID":"21bff1bd-c48b-4833-b3b3-ce5f1230db72","Type":"ContainerStarted","Data":"d14a8d32aae52ad00ed07e2a8f058ea6fb8ee78ca1e90e8bb9ec16b786cd104d"} Apr 16 13:11:46.147799 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.147747 2575 generic.go:358] "Generic (PLEG): container finished" podID="8156b1b5-dde0-4575-9f54-ea6d2acf9495" containerID="da97dad2273ad801fe3ea1981b5f706a4e3d6d19fb8617c1ebc39cd50538a526" exitCode=0 Apr 16 13:11:46.147890 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.147820 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-66xwt" event={"ID":"8156b1b5-dde0-4575-9f54-ea6d2acf9495","Type":"ContainerDied","Data":"da97dad2273ad801fe3ea1981b5f706a4e3d6d19fb8617c1ebc39cd50538a526"} Apr 16 13:11:46.149876 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.149548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-glppk" event={"ID":"4f10722c-b916-4e71-b492-f2605d67db12","Type":"ContainerStarted","Data":"9ecc7261695773cb378d46ae3a3f08a3ad437621a8656f647c2d3b1604944bde"} Apr 16 13:11:46.151289 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.151250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" event={"ID":"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4","Type":"ContainerStarted","Data":"d7e73a9a35d421dbbd97aca79d13bc130f377f508f2d5b1556743e958398dcb1"} Apr 16 13:11:46.153634 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.153616 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:11:46.153951 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.153880 2575 generic.go:358] "Generic (PLEG): container finished" podID="25d5ba90-b543-425c-9992-d5d1d1a63331" containerID="a7cebbb1d3e8469d54d8c48ecb202b9ef4d7f2270424239f2ab5dc3bee530a6a" exitCode=1 Apr 16 13:11:46.153951 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.153922 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" event={"ID":"25d5ba90-b543-425c-9992-d5d1d1a63331","Type":"ContainerStarted","Data":"b72e45723b085c30443c67d9f27fdb49d4a84397b39f13e1995dbcb395356e30"} Apr 16 13:11:46.153951 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.153935 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" 
event={"ID":"25d5ba90-b543-425c-9992-d5d1d1a63331","Type":"ContainerStarted","Data":"c065b1e4352ff0967cdb1297ccbb565b599d400e32d604e5f2bc93dc4411406d"} Apr 16 13:11:46.153951 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.153944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" event={"ID":"25d5ba90-b543-425c-9992-d5d1d1a63331","Type":"ContainerStarted","Data":"c32f1abf3370f4b687374622f5004afcacf3ed50b7873e48db1ad5d28e36870e"} Apr 16 13:11:46.153951 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.153952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" event={"ID":"25d5ba90-b543-425c-9992-d5d1d1a63331","Type":"ContainerDied","Data":"a7cebbb1d3e8469d54d8c48ecb202b9ef4d7f2270424239f2ab5dc3bee530a6a"} Apr 16 13:11:46.154255 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.153963 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" event={"ID":"25d5ba90-b543-425c-9992-d5d1d1a63331","Type":"ContainerStarted","Data":"05d6e83af4a0451f5e6b63dd56e31c06f48f96de619dfe4d2f93997d9708b7fc"} Apr 16 13:11:46.156276 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.155879 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g4rk4" event={"ID":"5e2c6e75-298e-4014-bf52-0dc9f276e559","Type":"ContainerStarted","Data":"ce10ca77280b5835d53a54fec290b74bca6a90e3a92a98c5f48848244d81effb"} Apr 16 13:11:46.157636 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.157588 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wfgv8" podStartSLOduration=3.408626094 podStartE2EDuration="20.157575193s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:11:28.688085198 +0000 UTC m=+3.277462268" lastFinishedPulling="2026-04-16 13:11:45.437034292 +0000 UTC m=+20.026411367" observedRunningTime="2026-04-16 
13:11:46.155610892 +0000 UTC m=+20.744987981" watchObservedRunningTime="2026-04-16 13:11:46.157575193 +0000 UTC m=+20.746952282" Apr 16 13:11:46.177978 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.177939 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-psf9p" podStartSLOduration=3.471871363 podStartE2EDuration="20.177923701s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:11:28.683047872 +0000 UTC m=+3.272424952" lastFinishedPulling="2026-04-16 13:11:45.389100224 +0000 UTC m=+19.978477290" observedRunningTime="2026-04-16 13:11:46.166427858 +0000 UTC m=+20.755804964" watchObservedRunningTime="2026-04-16 13:11:46.177923701 +0000 UTC m=+20.767300791" Apr 16 13:11:46.178625 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.178598 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-glspk" podStartSLOduration=3.469170978 podStartE2EDuration="20.178589496s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:11:28.679567736 +0000 UTC m=+3.268944808" lastFinishedPulling="2026-04-16 13:11:45.388986255 +0000 UTC m=+19.978363326" observedRunningTime="2026-04-16 13:11:46.178046262 +0000 UTC m=+20.767423343" watchObservedRunningTime="2026-04-16 13:11:46.178589496 +0000 UTC m=+20.767966584" Apr 16 13:11:46.191455 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.191414 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-glppk" podStartSLOduration=3.476815912 podStartE2EDuration="20.191398793s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:11:28.683456184 +0000 UTC m=+3.272833453" lastFinishedPulling="2026-04-16 13:11:45.398039268 +0000 UTC m=+19.987416334" observedRunningTime="2026-04-16 13:11:46.19100789 +0000 UTC m=+20.780384978" 
watchObservedRunningTime="2026-04-16 13:11:46.191398793 +0000 UTC m=+20.780775885" Apr 16 13:11:46.243405 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:46.243363 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g4rk4" podStartSLOduration=3.5102160270000002 podStartE2EDuration="20.243348884s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:11:28.656289766 +0000 UTC m=+3.245666851" lastFinishedPulling="2026-04-16 13:11:45.389422639 +0000 UTC m=+19.978799708" observedRunningTime="2026-04-16 13:11:46.243271121 +0000 UTC m=+20.832648241" watchObservedRunningTime="2026-04-16 13:11:46.243348884 +0000 UTC m=+20.832725976" Apr 16 13:11:47.020915 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:47.020882 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:11:47.159181 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:47.159142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" event={"ID":"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4","Type":"ContainerStarted","Data":"25b6bce5bbd2d568d37b08b92e27d5d2893c156542cdb145f85cf578a5a9e924"} Apr 16 13:11:47.162009 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:47.161983 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:11:47.162384 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:47.162355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" event={"ID":"25d5ba90-b543-425c-9992-d5d1d1a63331","Type":"ContainerStarted","Data":"43eccd175a5de1d6a95a5458c2adf6abd74e5177bd0af763ee5c89fde5c27fdb"} Apr 16 13:11:47.163659 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:47.163625 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8cszh" event={"ID":"c8f2f3e3-6cf2-490d-bb89-add0d52d2809","Type":"ContainerStarted","Data":"744cd0e2c59b8e25b375542ba79fe82df311f5ef3fa2bdb9b7ab5d45570077d9"} Apr 16 13:11:47.175566 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:47.175523 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8cszh" podStartSLOduration=4.474149198 podStartE2EDuration="21.175508799s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:11:28.687788717 +0000 UTC m=+3.277165789" lastFinishedPulling="2026-04-16 13:11:45.389148309 +0000 UTC m=+19.978525390" observedRunningTime="2026-04-16 13:11:47.175401348 +0000 UTC m=+21.764778437" watchObservedRunningTime="2026-04-16 13:11:47.175508799 +0000 UTC m=+21.764885887" Apr 16 13:11:47.988050 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:47.987898 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:11:47.020902259Z","UUID":"2263270a-c81e-4f46-8736-a83d4b270c6e","Handler":null,"Name":"","Endpoint":""} Apr 16 13:11:47.990599 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:47.990578 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:11:47.990721 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:47.990607 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:11:48.042265 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:48.042234 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv" Apr 16 13:11:48.042417 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:48.042275 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:48.042417 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:48.042244 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54" Apr 16 13:11:48.042417 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:48.042359 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a" Apr 16 13:11:48.042586 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:48.042441 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e" Apr 16 13:11:48.042586 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:48.042539 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a" Apr 16 13:11:48.608954 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:48.608865 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-glspk" Apr 16 13:11:48.609677 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:48.609660 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-glspk" Apr 16 13:11:49.170238 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:49.169962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" event={"ID":"8eb1ce78-01ab-41e7-bc32-6dbe30df94e4","Type":"ContainerStarted","Data":"bf23b2faf229bb95fe4c68bee749b10417ad8147a1a17cfc1aa1118138573d50"} Apr 16 13:11:49.173935 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:49.173911 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:11:49.174715 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:49.174690 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" event={"ID":"25d5ba90-b543-425c-9992-d5d1d1a63331","Type":"ContainerStarted","Data":"e9bd556b6f9791cf14329617cf2ae393d16a776f18a26634374d6fb76118acde"} Apr 16 13:11:49.185678 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:49.185640 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-69l49" podStartSLOduration=3.792554343 podStartE2EDuration="23.18562775s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:11:28.679503919 +0000 UTC m=+3.268880988" lastFinishedPulling="2026-04-16 13:11:48.072577323 +0000 UTC m=+22.661954395" observedRunningTime="2026-04-16 13:11:49.18531508 +0000 UTC 
m=+23.774692169" watchObservedRunningTime="2026-04-16 13:11:49.18562775 +0000 UTC m=+23.775004875" Apr 16 13:11:50.042576 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:50.042539 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:50.042576 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:50.042557 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54" Apr 16 13:11:50.043152 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:50.042539 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv" Apr 16 13:11:50.043152 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:50.042667 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a" Apr 16 13:11:50.043152 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:50.042738 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a" Apr 16 13:11:50.043152 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:50.042816 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e" Apr 16 13:11:51.180387 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.180206 2575 generic.go:358] "Generic (PLEG): container finished" podID="8156b1b5-dde0-4575-9f54-ea6d2acf9495" containerID="723af0a803948bd7654ac5c11aa944b4fb70a7868e100d82adb187c5524235ba" exitCode=0 Apr 16 13:11:51.181196 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.180291 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xwt" event={"ID":"8156b1b5-dde0-4575-9f54-ea6d2acf9495","Type":"ContainerDied","Data":"723af0a803948bd7654ac5c11aa944b4fb70a7868e100d82adb187c5524235ba"} Apr 16 13:11:51.183336 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.183317 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:11:51.183657 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.183637 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" event={"ID":"25d5ba90-b543-425c-9992-d5d1d1a63331","Type":"ContainerStarted","Data":"54e2e2651b8efd30c54690b682bcc43ea6efee4bf37d1aa2830de0a1787631c9"} Apr 16 13:11:51.183935 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.183914 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:51.184072 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.184058 2575 scope.go:117] "RemoveContainer" containerID="a7cebbb1d3e8469d54d8c48ecb202b9ef4d7f2270424239f2ab5dc3bee530a6a"
Apr 16 13:11:51.199087 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.199067 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:51.360893 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.360863 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-glspk"
Apr 16 13:11:51.361041 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.360992 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 13:11:51.361582 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.361560 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-glspk"
Apr 16 13:11:51.718126 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:51.718094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:51.718293 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:51.718212 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:51.718293 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:51.718275 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret podName:454de3ce-a596-4780-b1b7-e2fe418de97e nodeName:}" failed. No retries permitted until 2026-04-16 13:12:07.718261109 +0000 UTC m=+42.307638174 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret") pod "global-pull-secret-syncer-hqc54" (UID: "454de3ce-a596-4780-b1b7-e2fe418de97e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:52.045230 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.045204 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:52.045363 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.045204 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:52.045363 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:52.045316 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:52.045445 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.045204 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:52.045445 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:52.045383 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e"
Apr 16 13:11:52.045517 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:52.045449 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:52.187429 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.187390 2575 generic.go:358] "Generic (PLEG): container finished" podID="8156b1b5-dde0-4575-9f54-ea6d2acf9495" containerID="d9bd1450f9233728e631ce6e92d91cf5d767b4f8261efaadcf06de8109d4eb0e" exitCode=0
Apr 16 13:11:52.187838 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.187472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xwt" event={"ID":"8156b1b5-dde0-4575-9f54-ea6d2acf9495","Type":"ContainerDied","Data":"d9bd1450f9233728e631ce6e92d91cf5d767b4f8261efaadcf06de8109d4eb0e"}
Apr 16 13:11:52.191192 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.191169 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log"
Apr 16 13:11:52.191528 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.191507 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" event={"ID":"25d5ba90-b543-425c-9992-d5d1d1a63331","Type":"ContainerStarted","Data":"db01ab41b502de4dcc3b83b0b11b0d30023918244cd6ea1140170272ec61a924"}
Apr 16 13:11:52.191743 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.191728 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 13:11:52.191965 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.191949 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:52.205522 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.205499 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:52.231566 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.231511 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" podStartSLOduration=9.246452623 podStartE2EDuration="26.231496639s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:11:28.657371034 +0000 UTC m=+3.246748106" lastFinishedPulling="2026-04-16 13:11:45.642415056 +0000 UTC m=+20.231792122" observedRunningTime="2026-04-16 13:11:52.230930159 +0000 UTC m=+26.820307319" watchObservedRunningTime="2026-04-16 13:11:52.231496639 +0000 UTC m=+26.820873727"
Apr 16 13:11:52.485623 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.485591 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hqc54"]
Apr 16 13:11:52.485800 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.485692 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:52.485800 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:52.485783 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e"
Apr 16 13:11:52.488891 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.488864 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9b59n"]
Apr 16 13:11:52.489028 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.488957 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:52.489068 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:52.489045 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:52.489537 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.489501 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h9klv"]
Apr 16 13:11:52.489612 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:52.489586 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:52.489717 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:52.489694 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:53.195617 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:53.195530 2575 generic.go:358] "Generic (PLEG): container finished" podID="8156b1b5-dde0-4575-9f54-ea6d2acf9495" containerID="21e135ea3c82a7dbfc4c15127cc4d2e716e9f438c2f384a0b90489ddfda0ecd0" exitCode=0
Apr 16 13:11:53.196052 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:53.195614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xwt" event={"ID":"8156b1b5-dde0-4575-9f54-ea6d2acf9495","Type":"ContainerDied","Data":"21e135ea3c82a7dbfc4c15127cc4d2e716e9f438c2f384a0b90489ddfda0ecd0"}
Apr 16 13:11:53.196052 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:53.195789 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 13:11:54.045522 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:54.045493 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:54.045522 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:54.045511 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:54.045755 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:54.045493 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:54.045755 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:54.045610 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e"
Apr 16 13:11:54.045755 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:54.045698 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:54.045900 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:54.045792 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:54.197625 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:54.197595 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 13:11:56.044101 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:56.043865 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:56.044540 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:56.043928 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:56.044540 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:56.043954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:56.044540 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:56.044221 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:56.044540 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:56.044310 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:56.044540 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:56.044396 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e"
Apr 16 13:11:56.362351 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:56.362262 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:11:56.362538 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:56.362523 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 13:11:56.374777 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:56.374731 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" podUID="25d5ba90-b543-425c-9992-d5d1d1a63331" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 13:11:56.384459 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:56.384425 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp" podUID="25d5ba90-b543-425c-9992-d5d1d1a63331" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 16 13:11:58.045610 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.045579 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:11:58.046153 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.045691 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9klv" podUID="75e624b2-8f5a-4782-b50e-326781f4d00a"
Apr 16 13:11:58.046153 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.045579 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:11:58.046153 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.045579 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54"
Apr 16 13:11:58.046153 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.045801 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:11:58.046153 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.045909 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hqc54" podUID="454de3ce-a596-4780-b1b7-e2fe418de97e"
Apr 16 13:11:58.232635 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.232603 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeReady"
Apr 16 13:11:58.232802 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.232742 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 13:11:58.272218 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.271218 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"]
Apr 16 13:11:58.275660 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.275630 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn"]
Apr 16 13:11:58.275829 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.275809 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"
Apr 16 13:11:58.277804 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.277746 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 13:11:58.277804 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.277764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 13:11:58.278006 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.277850 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 13:11:58.278006 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.277857 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 13:11:58.278139 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.278076 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 13:11:58.278139 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.278083 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 13:11:58.278300 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.278187 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 13:11:58.278300 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.278258 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"]
Apr 16 13:11:58.278410 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.278338 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn"
Apr 16 13:11:58.280088 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.280069 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-pj6fp\""
Apr 16 13:11:58.280198 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.280188 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 13:11:58.281156 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.281137 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"]
Apr 16 13:11:58.281260 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.281227 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.283430 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.283410 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 13:11:58.284687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.283735 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-sxsq4\""
Apr 16 13:11:58.284687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.284191 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 13:11:58.284687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.284269 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 13:11:58.285013 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.284977 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn"]
Apr 16 13:11:58.285013 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.285010 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"]
Apr 16 13:11:58.285183 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.285026 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"]
Apr 16 13:11:58.285183 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.285143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"
Apr 16 13:11:58.287060 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.287029 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"]
Apr 16 13:11:58.287663 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.287224 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 13:11:58.288986 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.288968 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 13:11:58.299500 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.299433 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9r95z"]
Apr 16 13:11:58.301933 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.301897 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-msm5w"]
Apr 16 13:11:58.302878 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.302089 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9r95z"
Apr 16 13:11:58.304222 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.304205 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 13:11:58.304319 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.304206 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 13:11:58.304319 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.304273 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrfd8\""
Apr 16 13:11:58.304431 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.304325 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 13:11:58.304653 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.304637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-msm5w"
Apr 16 13:11:58.307450 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.307165 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 13:11:58.307450 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.307224 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xkkx5\""
Apr 16 13:11:58.307450 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.307175 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 13:11:58.308088 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.308060 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9r95z"]
Apr 16 13:11:58.308783 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.308762 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-msm5w"]
Apr 16 13:11:58.376288 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/afea1806-f99c-4639-af8b-93b4c36066c4-klusterlet-config\") pod \"klusterlet-addon-workmgr-86885c9c89-crvx8\" (UID: \"afea1806-f99c-4639-af8b-93b4c36066c4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"
Apr 16 13:11:58.376288 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-ca\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"
Apr 16 13:11:58.376543 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376392 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd77993a-2dcc-45b2-bced-9f3ea48a6328-ca-trust-extracted\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.376543 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-trusted-ca\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.376543 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-installation-pull-secrets\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.376543 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px4tr\" (UniqueName: \"kubernetes.io/projected/afea1806-f99c-4639-af8b-93b4c36066c4-kube-api-access-px4tr\") pod \"klusterlet-addon-workmgr-86885c9c89-crvx8\" (UID: \"afea1806-f99c-4639-af8b-93b4c36066c4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"
Apr 16 13:11:58.376543 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zbb7\" (UniqueName: \"kubernetes.io/projected/fbf1244e-299c-4320-99ad-e305cdf1a83b-kube-api-access-5zbb7\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"
Apr 16 13:11:58.376798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-image-registry-private-configuration\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.376798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-hub\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"
Apr 16 13:11:58.376798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376617 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/afea1806-f99c-4639-af8b-93b4c36066c4-tmp\") pod \"klusterlet-addon-workmgr-86885c9c89-crvx8\" (UID: \"afea1806-f99c-4639-af8b-93b4c36066c4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"
Apr 16 13:11:58.376798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"
Apr 16 13:11:58.376798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv6wh\" (UniqueName: \"kubernetes.io/projected/c2b810cb-3b39-4054-87ec-7dea7c076ae2-kube-api-access-hv6wh\") pod \"managed-serviceaccount-addon-agent-6d7489bc99-kr9tn\" (UID: \"c2b810cb-3b39-4054-87ec-7dea7c076ae2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn"
Apr 16 13:11:58.376798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-bound-sa-token\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.376798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.376798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376765 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-certificates\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.376798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376795 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmxxz\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-kube-api-access-vmxxz\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.377244 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"
Apr 16 13:11:58.377244 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c2b810cb-3b39-4054-87ec-7dea7c076ae2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6d7489bc99-kr9tn\" (UID: \"c2b810cb-3b39-4054-87ec-7dea7c076ae2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn"
Apr 16 13:11:58.377244 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.376912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fbf1244e-299c-4320-99ad-e305cdf1a83b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"
Apr 16 13:11:58.477871 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.477827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd77993a-2dcc-45b2-bced-9f3ea48a6328-ca-trust-extracted\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.477871 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.477875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z"
Apr 16 13:11:58.478100 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.477895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ldsd\" (UniqueName: \"kubernetes.io/projected/7b5d7aa9-dd9d-487f-844d-3f40b038a994-kube-api-access-5ldsd\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z"
Apr 16 13:11:58.478100 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.477924 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5fdfc31d-52a5-4228-aa8d-7f803085d57e-tmp-dir\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w"
Apr 16 13:11:58.478100 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478061 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-trusted-ca\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.478339 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-installation-pull-secrets\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.478339 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-px4tr\" (UniqueName: \"kubernetes.io/projected/afea1806-f99c-4639-af8b-93b4c36066c4-kube-api-access-px4tr\") pod \"klusterlet-addon-workmgr-86885c9c89-crvx8\" (UID: \"afea1806-f99c-4639-af8b-93b4c36066c4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"
Apr 16 13:11:58.478339 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478273 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zbb7\" (UniqueName: \"kubernetes.io/projected/fbf1244e-299c-4320-99ad-e305cdf1a83b-kube-api-access-5zbb7\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"
Apr 16 13:11:58.478339 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd77993a-2dcc-45b2-bced-9f3ea48a6328-ca-trust-extracted\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.478339 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-image-registry-private-configuration\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:11:58.478588 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478367 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-hub\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"
Apr 16 13:11:58.478588 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdfc31d-52a5-4228-aa8d-7f803085d57e-config-volume\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w"
Apr 16 13:11:58.478588 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/afea1806-f99c-4639-af8b-93b4c36066c4-tmp\") pod \"klusterlet-addon-workmgr-86885c9c89-crvx8\" (UID: \"afea1806-f99c-4639-af8b-93b4c36066c4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"
Apr 16 13:11:58.478588 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.478588 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv6wh\" (UniqueName: \"kubernetes.io/projected/c2b810cb-3b39-4054-87ec-7dea7c076ae2-kube-api-access-hv6wh\") pod \"managed-serviceaccount-addon-agent-6d7489bc99-kr9tn\" (UID: \"c2b810cb-3b39-4054-87ec-7dea7c076ae2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn" Apr 16 13:11:58.478588 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-bound-sa-token\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.478588 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.478900 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-certificates\") pod \"image-registry-7d7cbb65c5-z5fm8\" 
(UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.478900 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmxxz\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-kube-api-access-vmxxz\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.478900 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.478900 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c2b810cb-3b39-4054-87ec-7dea7c076ae2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6d7489bc99-kr9tn\" (UID: \"c2b810cb-3b39-4054-87ec-7dea7c076ae2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn" Apr 16 13:11:58.478900 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478739 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcjzv\" (UniqueName: \"kubernetes.io/projected/5fdfc31d-52a5-4228-aa8d-7f803085d57e-kube-api-access-fcjzv\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:11:58.478900 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478772 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fbf1244e-299c-4320-99ad-e305cdf1a83b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.478900 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:11:58.478900 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/afea1806-f99c-4639-af8b-93b4c36066c4-klusterlet-config\") pod \"klusterlet-addon-workmgr-86885c9c89-crvx8\" (UID: \"afea1806-f99c-4639-af8b-93b4c36066c4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" Apr 16 13:11:58.478900 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.478859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-ca\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.479283 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.478925 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:11:58.479283 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.478941 2575 projected.go:194] Error 
preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d7cbb65c5-z5fm8: secret "image-registry-tls" not found Apr 16 13:11:58.479283 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.479006 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls podName:bd77993a-2dcc-45b2-bced-9f3ea48a6328 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:58.97898565 +0000 UTC m=+33.568362735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls") pod "image-registry-7d7cbb65c5-z5fm8" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328") : secret "image-registry-tls" not found Apr 16 13:11:58.480585 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.480527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fbf1244e-299c-4320-99ad-e305cdf1a83b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.480826 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.480804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/afea1806-f99c-4639-af8b-93b4c36066c4-tmp\") pod \"klusterlet-addon-workmgr-86885c9c89-crvx8\" (UID: \"afea1806-f99c-4639-af8b-93b4c36066c4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" Apr 16 13:11:58.480931 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.480824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-trusted-ca\") pod 
\"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.480986 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.480962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-certificates\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.483858 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.483638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-installation-pull-secrets\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.483858 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.483681 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/afea1806-f99c-4639-af8b-93b4c36066c4-klusterlet-config\") pod \"klusterlet-addon-workmgr-86885c9c89-crvx8\" (UID: \"afea1806-f99c-4639-af8b-93b4c36066c4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" Apr 16 13:11:58.483858 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.483682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c2b810cb-3b39-4054-87ec-7dea7c076ae2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6d7489bc99-kr9tn\" (UID: \"c2b810cb-3b39-4054-87ec-7dea7c076ae2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn" Apr 16 13:11:58.484052 ip-10-0-142-166 
kubenswrapper[2575]: I0416 13:11:58.483926 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.484382 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.484357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-ca\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.484724 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.484700 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-hub\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.484881 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.484858 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fbf1244e-299c-4320-99ad-e305cdf1a83b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.485274 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.485252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-image-registry-private-configuration\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.489427 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.489406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zbb7\" (UniqueName: \"kubernetes.io/projected/fbf1244e-299c-4320-99ad-e305cdf1a83b-kube-api-access-5zbb7\") pod \"cluster-proxy-proxy-agent-55f78db797-w788w\" (UID: \"fbf1244e-299c-4320-99ad-e305cdf1a83b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.490001 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.489887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmxxz\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-kube-api-access-vmxxz\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.490001 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.489959 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-bound-sa-token\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.490257 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.490211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-px4tr\" (UniqueName: \"kubernetes.io/projected/afea1806-f99c-4639-af8b-93b4c36066c4-kube-api-access-px4tr\") pod \"klusterlet-addon-workmgr-86885c9c89-crvx8\" (UID: \"afea1806-f99c-4639-af8b-93b4c36066c4\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" Apr 16 13:11:58.491206 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.491186 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv6wh\" (UniqueName: \"kubernetes.io/projected/c2b810cb-3b39-4054-87ec-7dea7c076ae2-kube-api-access-hv6wh\") pod \"managed-serviceaccount-addon-agent-6d7489bc99-kr9tn\" (UID: \"c2b810cb-3b39-4054-87ec-7dea7c076ae2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn" Apr 16 13:11:58.579946 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.579848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z" Apr 16 13:11:58.579946 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.579890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ldsd\" (UniqueName: \"kubernetes.io/projected/7b5d7aa9-dd9d-487f-844d-3f40b038a994-kube-api-access-5ldsd\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z" Apr 16 13:11:58.580201 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.579967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5fdfc31d-52a5-4228-aa8d-7f803085d57e-tmp-dir\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:11:58.580201 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.580021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5fdfc31d-52a5-4228-aa8d-7f803085d57e-config-volume\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:11:58.580201 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.580032 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:11:58.580201 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.580096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcjzv\" (UniqueName: \"kubernetes.io/projected/5fdfc31d-52a5-4228-aa8d-7f803085d57e-kube-api-access-fcjzv\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:11:58.580201 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.580113 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert podName:7b5d7aa9-dd9d-487f-844d-3f40b038a994 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:59.080092782 +0000 UTC m=+33.669469863 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert") pod "ingress-canary-9r95z" (UID: "7b5d7aa9-dd9d-487f-844d-3f40b038a994") : secret "canary-serving-cert" not found Apr 16 13:11:58.580201 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.580174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:11:58.580506 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.580338 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:11:58.580506 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.580381 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls podName:5fdfc31d-52a5-4228-aa8d-7f803085d57e nodeName:}" failed. No retries permitted until 2026-04-16 13:11:59.080367732 +0000 UTC m=+33.669744811 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls") pod "dns-default-msm5w" (UID: "5fdfc31d-52a5-4228-aa8d-7f803085d57e") : secret "dns-default-metrics-tls" not found Apr 16 13:11:58.580506 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.580413 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5fdfc31d-52a5-4228-aa8d-7f803085d57e-tmp-dir\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:11:58.580741 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.580712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdfc31d-52a5-4228-aa8d-7f803085d57e-config-volume\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:11:58.588078 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.588051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcjzv\" (UniqueName: \"kubernetes.io/projected/5fdfc31d-52a5-4228-aa8d-7f803085d57e-kube-api-access-fcjzv\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:11:58.588314 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.588295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ldsd\" (UniqueName: \"kubernetes.io/projected/7b5d7aa9-dd9d-487f-844d-3f40b038a994-kube-api-access-5ldsd\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z" Apr 16 13:11:58.600212 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.600185 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" Apr 16 13:11:58.606993 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.606973 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn" Apr 16 13:11:58.631943 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.631899 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" Apr 16 13:11:58.902670 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.902492 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn"] Apr 16 13:11:58.905843 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.905811 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"] Apr 16 13:11:58.906630 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.906607 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"] Apr 16 13:11:58.961247 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:58.961210 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2b810cb_3b39_4054_87ec_7dea7c076ae2.slice/crio-0ffe6112097768a86d03b17f0996c1728f5d70ae6b904b59f74ea7336e333899 WatchSource:0}: Error finding container 0ffe6112097768a86d03b17f0996c1728f5d70ae6b904b59f74ea7336e333899: Status 404 returned error can't find the container with id 0ffe6112097768a86d03b17f0996c1728f5d70ae6b904b59f74ea7336e333899 Apr 16 13:11:58.961688 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:58.961584 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbf1244e_299c_4320_99ad_e305cdf1a83b.slice/crio-ff9da1d7d98d977e6142173a505528c149aa477f436ccfe762653b9c939c83ef WatchSource:0}: Error finding container ff9da1d7d98d977e6142173a505528c149aa477f436ccfe762653b9c939c83ef: Status 404 returned error can't find the container with id ff9da1d7d98d977e6142173a505528c149aa477f436ccfe762653b9c939c83ef Apr 16 13:11:58.962737 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:11:58.962692 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafea1806_f99c_4639_af8b_93b4c36066c4.slice/crio-621f162519309d4d694802921c40ec0a75dde42e75c573e832237a28eac6ec2d WatchSource:0}: Error finding container 621f162519309d4d694802921c40ec0a75dde42e75c573e832237a28eac6ec2d: Status 404 returned error can't find the container with id 621f162519309d4d694802921c40ec0a75dde42e75c573e832237a28eac6ec2d Apr 16 13:11:58.985003 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:58.984978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:58.985165 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.985149 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:11:58.985222 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.985166 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d7cbb65c5-z5fm8: secret "image-registry-tls" not found Apr 16 13:11:58.985222 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:58.985212 2575 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls podName:bd77993a-2dcc-45b2-bced-9f3ea48a6328 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:59.985196154 +0000 UTC m=+34.574573234 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls") pod "image-registry-7d7cbb65c5-z5fm8" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328") : secret "image-registry-tls" not found Apr 16 13:11:59.085875 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:59.085850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:11:59.086433 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:59.085899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z" Apr 16 13:11:59.086433 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.085979 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:11:59.086433 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.085986 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:11:59.086433 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.086029 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert podName:7b5d7aa9-dd9d-487f-844d-3f40b038a994 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:12:00.086015527 +0000 UTC m=+34.675392593 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert") pod "ingress-canary-9r95z" (UID: "7b5d7aa9-dd9d-487f-844d-3f40b038a994") : secret "canary-serving-cert" not found Apr 16 13:11:59.086433 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.086044 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls podName:5fdfc31d-52a5-4228-aa8d-7f803085d57e nodeName:}" failed. No retries permitted until 2026-04-16 13:12:00.086037146 +0000 UTC m=+34.675414213 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls") pod "dns-default-msm5w" (UID: "5fdfc31d-52a5-4228-aa8d-7f803085d57e") : secret "dns-default-metrics-tls" not found Apr 16 13:11:59.211006 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:59.210967 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn" event={"ID":"c2b810cb-3b39-4054-87ec-7dea7c076ae2","Type":"ContainerStarted","Data":"0ffe6112097768a86d03b17f0996c1728f5d70ae6b904b59f74ea7336e333899"} Apr 16 13:11:59.211974 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:59.211942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" event={"ID":"fbf1244e-299c-4320-99ad-e305cdf1a83b","Type":"ContainerStarted","Data":"ff9da1d7d98d977e6142173a505528c149aa477f436ccfe762653b9c939c83ef"} Apr 16 13:11:59.212851 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:59.212828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" 
event={"ID":"afea1806-f99c-4639-af8b-93b4c36066c4","Type":"ContainerStarted","Data":"621f162519309d4d694802921c40ec0a75dde42e75c573e832237a28eac6ec2d"} Apr 16 13:11:59.215295 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:59.215276 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xwt" event={"ID":"8156b1b5-dde0-4575-9f54-ea6d2acf9495","Type":"ContainerStarted","Data":"57bf455f7c2318793b78254b6c6669b76b8d55225464f9ec5939d43e8a6853fb"} Apr 16 13:11:59.689983 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:59.689939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wk2\" (UniqueName: \"kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2\") pod \"network-check-target-h9klv\" (UID: \"75e624b2-8f5a-4782-b50e-326781f4d00a\") " pod="openshift-network-diagnostics/network-check-target-h9klv" Apr 16 13:11:59.690184 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.690110 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:59.690184 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.690147 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:59.690184 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.690161 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j8wk2 for pod openshift-network-diagnostics/network-check-target-h9klv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:59.690316 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.690215 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2 podName:75e624b2-8f5a-4782-b50e-326781f4d00a nodeName:}" failed. No retries permitted until 2026-04-16 13:12:31.690199393 +0000 UTC m=+66.279576472 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-j8wk2" (UniqueName: "kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2") pod "network-check-target-h9klv" (UID: "75e624b2-8f5a-4782-b50e-326781f4d00a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:59.792329 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:59.791448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:11:59.792329 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.791689 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:59.792329 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.791756 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs podName:0865faea-916d-435f-88f5-d2b559f1d79a nodeName:}" failed. No retries permitted until 2026-04-16 13:12:31.791738015 +0000 UTC m=+66.381115088 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs") pod "network-metrics-daemon-9b59n" (UID: "0865faea-916d-435f-88f5-d2b559f1d79a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:59.992973 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:11:59.992928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:11:59.993232 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.993134 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:11:59.993232 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.993152 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d7cbb65c5-z5fm8: secret "image-registry-tls" not found Apr 16 13:11:59.993362 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:11:59.993243 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls podName:bd77993a-2dcc-45b2-bced-9f3ea48a6328 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:01.993195339 +0000 UTC m=+36.582572411 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls") pod "image-registry-7d7cbb65c5-z5fm8" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328") : secret "image-registry-tls" not found Apr 16 13:12:00.042867 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.042419 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54" Apr 16 13:12:00.042867 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.042658 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv" Apr 16 13:12:00.044364 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.043552 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:12:00.045911 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.045409 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhs75\"" Apr 16 13:12:00.045911 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.045552 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hkf74\"" Apr 16 13:12:00.045911 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.045409 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 13:12:00.045911 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.045614 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:12:00.045911 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.045723 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 13:12:00.045911 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.045768 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:12:00.094391 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.094251 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:12:00.095502 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:00.094676 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:12:00.095502 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:00.094743 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls podName:5fdfc31d-52a5-4228-aa8d-7f803085d57e nodeName:}" failed. No retries permitted until 2026-04-16 13:12:02.094723554 +0000 UTC m=+36.684100626 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls") pod "dns-default-msm5w" (UID: "5fdfc31d-52a5-4228-aa8d-7f803085d57e") : secret "dns-default-metrics-tls" not found Apr 16 13:12:00.095502 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.094961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z" Apr 16 13:12:00.095502 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:00.095093 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:12:00.095502 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:00.095161 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert podName:7b5d7aa9-dd9d-487f-844d-3f40b038a994 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:12:02.095144936 +0000 UTC m=+36.684522005 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert") pod "ingress-canary-9r95z" (UID: "7b5d7aa9-dd9d-487f-844d-3f40b038a994") : secret "canary-serving-cert" not found Apr 16 13:12:00.226200 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.226144 2575 generic.go:358] "Generic (PLEG): container finished" podID="8156b1b5-dde0-4575-9f54-ea6d2acf9495" containerID="57bf455f7c2318793b78254b6c6669b76b8d55225464f9ec5939d43e8a6853fb" exitCode=0 Apr 16 13:12:00.226360 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:00.226216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xwt" event={"ID":"8156b1b5-dde0-4575-9f54-ea6d2acf9495","Type":"ContainerDied","Data":"57bf455f7c2318793b78254b6c6669b76b8d55225464f9ec5939d43e8a6853fb"} Apr 16 13:12:01.233777 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:01.233742 2575 generic.go:358] "Generic (PLEG): container finished" podID="8156b1b5-dde0-4575-9f54-ea6d2acf9495" containerID="6ef7c3f141288b8d99473b06b38dfba05343c4da0a12b0c930f96c714f4f9fc7" exitCode=0 Apr 16 13:12:01.234256 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:01.233826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-66xwt" event={"ID":"8156b1b5-dde0-4575-9f54-ea6d2acf9495","Type":"ContainerDied","Data":"6ef7c3f141288b8d99473b06b38dfba05343c4da0a12b0c930f96c714f4f9fc7"} Apr 16 13:12:02.015415 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:02.015361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " 
pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:12:02.015594 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:02.015552 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:12:02.015594 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:02.015576 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d7cbb65c5-z5fm8: secret "image-registry-tls" not found Apr 16 13:12:02.015698 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:02.015647 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls podName:bd77993a-2dcc-45b2-bced-9f3ea48a6328 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:06.015625551 +0000 UTC m=+40.605002642 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls") pod "image-registry-7d7cbb65c5-z5fm8" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328") : secret "image-registry-tls" not found Apr 16 13:12:02.116685 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:02.116576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:12:02.116685 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:02.116663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z" Apr 16 13:12:02.116871 
ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:02.116811 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:12:02.116871 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:02.116814 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:12:02.116957 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:02.116892 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert podName:7b5d7aa9-dd9d-487f-844d-3f40b038a994 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:06.116871368 +0000 UTC m=+40.706248453 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert") pod "ingress-canary-9r95z" (UID: "7b5d7aa9-dd9d-487f-844d-3f40b038a994") : secret "canary-serving-cert" not found Apr 16 13:12:02.116957 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:02.116912 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls podName:5fdfc31d-52a5-4228-aa8d-7f803085d57e nodeName:}" failed. No retries permitted until 2026-04-16 13:12:06.116902 +0000 UTC m=+40.706279074 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls") pod "dns-default-msm5w" (UID: "5fdfc31d-52a5-4228-aa8d-7f803085d57e") : secret "dns-default-metrics-tls" not found Apr 16 13:12:05.242959 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:05.242927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn" event={"ID":"c2b810cb-3b39-4054-87ec-7dea7c076ae2","Type":"ContainerStarted","Data":"a0455fe44eff2885d9b7bb91144e693d4527df927bd7915c372fe2fbe8e8e85c"} Apr 16 13:12:05.244249 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:05.244228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" event={"ID":"fbf1244e-299c-4320-99ad-e305cdf1a83b","Type":"ContainerStarted","Data":"7806609f945bf119bfaa203eb058e9e8d542f60b49594a350dd40259573c95cf"} Apr 16 13:12:05.245493 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:05.245469 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" event={"ID":"afea1806-f99c-4639-af8b-93b4c36066c4","Type":"ContainerStarted","Data":"35acc92cdbcf278a65b1b3e081258d31d673c3b28c676605dfcc049c51cf85b5"} Apr 16 13:12:05.245684 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:05.245666 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" Apr 16 13:12:05.247351 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:05.247334 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" Apr 16 13:12:05.248656 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:05.248639 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-66xwt" event={"ID":"8156b1b5-dde0-4575-9f54-ea6d2acf9495","Type":"ContainerStarted","Data":"bf643360ecef7875830a514a359298d07e17a9dfc60e3d944674e64379950a08"} Apr 16 13:12:05.255796 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:05.255735 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn" podStartSLOduration=16.260178993 podStartE2EDuration="22.255724651s" podCreationTimestamp="2026-04-16 13:11:43 +0000 UTC" firstStartedPulling="2026-04-16 13:11:58.972996684 +0000 UTC m=+33.562373754" lastFinishedPulling="2026-04-16 13:12:04.968542344 +0000 UTC m=+39.557919412" observedRunningTime="2026-04-16 13:12:05.255408207 +0000 UTC m=+39.844785320" watchObservedRunningTime="2026-04-16 13:12:05.255724651 +0000 UTC m=+39.845101733" Apr 16 13:12:05.273569 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:05.273534 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-66xwt" podStartSLOduration=8.966405458 podStartE2EDuration="39.2735229s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:11:28.688070442 +0000 UTC m=+3.277447513" lastFinishedPulling="2026-04-16 13:11:58.995187871 +0000 UTC m=+33.584564955" observedRunningTime="2026-04-16 13:12:05.272385348 +0000 UTC m=+39.861762450" watchObservedRunningTime="2026-04-16 13:12:05.2735229 +0000 UTC m=+39.862899987" Apr 16 13:12:05.284727 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:05.284685 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" podStartSLOduration=16.270372236 podStartE2EDuration="22.284674016s" podCreationTimestamp="2026-04-16 13:11:43 +0000 UTC" firstStartedPulling="2026-04-16 13:11:58.972777011 +0000 UTC m=+33.562154090" 
lastFinishedPulling="2026-04-16 13:12:04.987078792 +0000 UTC m=+39.576455870" observedRunningTime="2026-04-16 13:12:05.28440959 +0000 UTC m=+39.873786697" watchObservedRunningTime="2026-04-16 13:12:05.284674016 +0000 UTC m=+39.874051102" Apr 16 13:12:06.047620 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:06.047579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:12:06.047792 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:06.047721 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:12:06.047792 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:06.047735 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d7cbb65c5-z5fm8: secret "image-registry-tls" not found Apr 16 13:12:06.047792 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:06.047785 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls podName:bd77993a-2dcc-45b2-bced-9f3ea48a6328 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:14.047769945 +0000 UTC m=+48.637147013 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls") pod "image-registry-7d7cbb65c5-z5fm8" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328") : secret "image-registry-tls" not found Apr 16 13:12:06.148648 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:06.148607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:12:06.148819 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:06.148691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z" Apr 16 13:12:06.148819 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:06.148711 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:12:06.148819 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:06.148793 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls podName:5fdfc31d-52a5-4228-aa8d-7f803085d57e nodeName:}" failed. No retries permitted until 2026-04-16 13:12:14.148776564 +0000 UTC m=+48.738153630 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls") pod "dns-default-msm5w" (UID: "5fdfc31d-52a5-4228-aa8d-7f803085d57e") : secret "dns-default-metrics-tls" not found Apr 16 13:12:06.148819 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:06.148815 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:12:06.149009 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:06.148872 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert podName:7b5d7aa9-dd9d-487f-844d-3f40b038a994 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:14.148855883 +0000 UTC m=+48.738232957 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert") pod "ingress-canary-9r95z" (UID: "7b5d7aa9-dd9d-487f-844d-3f40b038a994") : secret "canary-serving-cert" not found Apr 16 13:12:07.763377 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:07.763339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54" Apr 16 13:12:07.766873 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:07.766845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/454de3ce-a596-4780-b1b7-e2fe418de97e-original-pull-secret\") pod \"global-pull-secret-syncer-hqc54\" (UID: \"454de3ce-a596-4780-b1b7-e2fe418de97e\") " pod="kube-system/global-pull-secret-syncer-hqc54" Apr 16 13:12:07.875347 ip-10-0-142-166 kubenswrapper[2575]: I0416 
13:12:07.875298 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hqc54" Apr 16 13:12:07.986998 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:07.986968 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hqc54"] Apr 16 13:12:07.991651 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:12:07.991615 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod454de3ce_a596_4780_b1b7_e2fe418de97e.slice/crio-cf7b4ec4daf34faebf015a63bf7d5cbcef8e983de0e85b92ac6088e782e88180 WatchSource:0}: Error finding container cf7b4ec4daf34faebf015a63bf7d5cbcef8e983de0e85b92ac6088e782e88180: Status 404 returned error can't find the container with id cf7b4ec4daf34faebf015a63bf7d5cbcef8e983de0e85b92ac6088e782e88180 Apr 16 13:12:08.254874 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:08.254839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hqc54" event={"ID":"454de3ce-a596-4780-b1b7-e2fe418de97e","Type":"ContainerStarted","Data":"cf7b4ec4daf34faebf015a63bf7d5cbcef8e983de0e85b92ac6088e782e88180"} Apr 16 13:12:08.256628 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:08.256597 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" event={"ID":"fbf1244e-299c-4320-99ad-e305cdf1a83b","Type":"ContainerStarted","Data":"63da4c46b3303581a36b0469998cc1432b958641cc2fa400bb23443a56737afa"} Apr 16 13:12:08.256628 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:08.256625 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" event={"ID":"fbf1244e-299c-4320-99ad-e305cdf1a83b","Type":"ContainerStarted","Data":"bd2cbb7d2ab9b352a994eaefc29860e88eebe20e5eeb2862188aa16e4c0615c5"} Apr 16 13:12:08.273636 
ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:08.273596 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" podStartSLOduration=16.832605765 podStartE2EDuration="25.273582895s" podCreationTimestamp="2026-04-16 13:11:43 +0000 UTC" firstStartedPulling="2026-04-16 13:11:58.972673039 +0000 UTC m=+33.562050124" lastFinishedPulling="2026-04-16 13:12:07.413650188 +0000 UTC m=+42.003027254" observedRunningTime="2026-04-16 13:12:08.272415262 +0000 UTC m=+42.861792362" watchObservedRunningTime="2026-04-16 13:12:08.273582895 +0000 UTC m=+42.862959980" Apr 16 13:12:13.268576 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:13.268537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hqc54" event={"ID":"454de3ce-a596-4780-b1b7-e2fe418de97e","Type":"ContainerStarted","Data":"3a7fa9fd5c050845bf6d4b8dd4fad103bc40b89f9a43340aa7373ec9b7b500fb"} Apr 16 13:12:13.280388 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:13.280333 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hqc54" podStartSLOduration=33.886423657 podStartE2EDuration="38.280316348s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:12:07.993671604 +0000 UTC m=+42.583048670" lastFinishedPulling="2026-04-16 13:12:12.38756428 +0000 UTC m=+46.976941361" observedRunningTime="2026-04-16 13:12:13.280201552 +0000 UTC m=+47.869578631" watchObservedRunningTime="2026-04-16 13:12:13.280316348 +0000 UTC m=+47.869693444" Apr 16 13:12:14.118316 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:14.118282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " 
pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:12:14.118498 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:14.118450 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:12:14.118498 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:14.118471 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d7cbb65c5-z5fm8: secret "image-registry-tls" not found Apr 16 13:12:14.118567 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:14.118527 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls podName:bd77993a-2dcc-45b2-bced-9f3ea48a6328 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:30.118511009 +0000 UTC m=+64.707888081 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls") pod "image-registry-7d7cbb65c5-z5fm8" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328") : secret "image-registry-tls" not found Apr 16 13:12:14.218937 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:14.218902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w" Apr 16 13:12:14.219094 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:14.218954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z" Apr 16 13:12:14.219094 
ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:14.219071 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:14.219231 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:14.219076 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:14.219231 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:14.219186 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls podName:5fdfc31d-52a5-4228-aa8d-7f803085d57e nodeName:}" failed. No retries permitted until 2026-04-16 13:12:30.219164255 +0000 UTC m=+64.808541335 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls") pod "dns-default-msm5w" (UID: "5fdfc31d-52a5-4228-aa8d-7f803085d57e") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:14.219231 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:14.219209 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert podName:7b5d7aa9-dd9d-487f-844d-3f40b038a994 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:30.219196787 +0000 UTC m=+64.808573853 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert") pod "ingress-canary-9r95z" (UID: "7b5d7aa9-dd9d-487f-844d-3f40b038a994") : secret "canary-serving-cert" not found
Apr 16 13:12:26.384899 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:26.384871 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jzvgp"
Apr 16 13:12:30.139602 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:30.139562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:12:30.140017 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:30.139722 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:12:30.140017 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:30.139744 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d7cbb65c5-z5fm8: secret "image-registry-tls" not found
Apr 16 13:12:30.140017 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:30.139806 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls podName:bd77993a-2dcc-45b2-bced-9f3ea48a6328 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:02.139790989 +0000 UTC m=+96.729168055 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls") pod "image-registry-7d7cbb65c5-z5fm8" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328") : secret "image-registry-tls" not found
Apr 16 13:12:30.240055 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:30.240007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w"
Apr 16 13:12:30.240055 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:30.240070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z"
Apr 16 13:12:30.240303 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:30.240199 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:30.240303 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:30.240221 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:30.240303 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:30.240276 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert podName:7b5d7aa9-dd9d-487f-844d-3f40b038a994 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:02.240260259 +0000 UTC m=+96.829637325 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert") pod "ingress-canary-9r95z" (UID: "7b5d7aa9-dd9d-487f-844d-3f40b038a994") : secret "canary-serving-cert" not found
Apr 16 13:12:30.240303 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:30.240290 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls podName:5fdfc31d-52a5-4228-aa8d-7f803085d57e nodeName:}" failed. No retries permitted until 2026-04-16 13:13:02.240284055 +0000 UTC m=+96.829661120 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls") pod "dns-default-msm5w" (UID: "5fdfc31d-52a5-4228-aa8d-7f803085d57e") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:31.750665 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:31.750620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wk2\" (UniqueName: \"kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2\") pod \"network-check-target-h9klv\" (UID: \"75e624b2-8f5a-4782-b50e-326781f4d00a\") " pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:12:31.752754 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:31.752734 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 13:12:31.762703 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:31.762681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 13:12:31.774498 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:31.774471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8wk2\" (UniqueName: \"kubernetes.io/projected/75e624b2-8f5a-4782-b50e-326781f4d00a-kube-api-access-j8wk2\") pod \"network-check-target-h9klv\" (UID: \"75e624b2-8f5a-4782-b50e-326781f4d00a\") " pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:12:31.851211 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:31.851172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:12:31.853130 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:31.853097 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 13:12:31.861406 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:31.861385 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 13:12:31.861484 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:12:31.861447 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs podName:0865faea-916d-435f-88f5-d2b559f1d79a nodeName:}" failed. No retries permitted until 2026-04-16 13:13:35.86143143 +0000 UTC m=+130.450808496 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs") pod "network-metrics-daemon-9b59n" (UID: "0865faea-916d-435f-88f5-d2b559f1d79a") : secret "metrics-daemon-secret" not found
Apr 16 13:12:31.865198 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:31.865179 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhs75\""
Apr 16 13:12:31.873985 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:31.873967 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:12:31.984637 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:31.984609 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h9klv"]
Apr 16 13:12:31.987524 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:12:31.987500 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e624b2_8f5a_4782_b50e_326781f4d00a.slice/crio-b42909a74a2ea51f3509c53e9bf8f2f5cd51d06c94f8b8a920de31f24da99962 WatchSource:0}: Error finding container b42909a74a2ea51f3509c53e9bf8f2f5cd51d06c94f8b8a920de31f24da99962: Status 404 returned error can't find the container with id b42909a74a2ea51f3509c53e9bf8f2f5cd51d06c94f8b8a920de31f24da99962
Apr 16 13:12:32.318800 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:32.318756 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h9klv" event={"ID":"75e624b2-8f5a-4782-b50e-326781f4d00a","Type":"ContainerStarted","Data":"b42909a74a2ea51f3509c53e9bf8f2f5cd51d06c94f8b8a920de31f24da99962"}
Apr 16 13:12:35.327626 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:35.327533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h9klv" event={"ID":"75e624b2-8f5a-4782-b50e-326781f4d00a","Type":"ContainerStarted","Data":"de63724e5160689a75b96ebd999a3e95ebeffe5db9746fadebb4c4ada639947b"}
Apr 16 13:12:35.328069 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:35.327681 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:12:35.340785 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:12:35.340738 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-h9klv" podStartSLOduration=66.29188869 podStartE2EDuration="1m9.340710157s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:12:31.989283015 +0000 UTC m=+66.578660081" lastFinishedPulling="2026-04-16 13:12:35.038104481 +0000 UTC m=+69.627481548" observedRunningTime="2026-04-16 13:12:35.340649184 +0000 UTC m=+69.930026293" watchObservedRunningTime="2026-04-16 13:12:35.340710157 +0000 UTC m=+69.930087245"
Apr 16 13:13:02.188382 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:13:02.188343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:13:02.188841 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:13:02.188498 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:13:02.188841 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:13:02.188519 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7d7cbb65c5-z5fm8: secret "image-registry-tls" not found
Apr 16 13:13:02.188841 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:13:02.188587 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls podName:bd77993a-2dcc-45b2-bced-9f3ea48a6328 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:06.188570407 +0000 UTC m=+160.777947473 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls") pod "image-registry-7d7cbb65c5-z5fm8" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328") : secret "image-registry-tls" not found
Apr 16 13:13:02.289065 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:13:02.289025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w"
Apr 16 13:13:02.289065 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:13:02.289070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z"
Apr 16 13:13:02.289298 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:13:02.289181 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:13:02.289298 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:13:02.289262 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls podName:5fdfc31d-52a5-4228-aa8d-7f803085d57e nodeName:}" failed. No retries permitted until 2026-04-16 13:14:06.289244139 +0000 UTC m=+160.878621210 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls") pod "dns-default-msm5w" (UID: "5fdfc31d-52a5-4228-aa8d-7f803085d57e") : secret "dns-default-metrics-tls" not found
Apr 16 13:13:02.289298 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:13:02.289190 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:13:02.289399 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:13:02.289316 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert podName:7b5d7aa9-dd9d-487f-844d-3f40b038a994 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:06.289302184 +0000 UTC m=+160.878679251 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert") pod "ingress-canary-9r95z" (UID: "7b5d7aa9-dd9d-487f-844d-3f40b038a994") : secret "canary-serving-cert" not found
Apr 16 13:13:06.332256 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:13:06.332223 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-h9klv"
Apr 16 13:13:35.931961 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:13:35.931908 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:13:35.932572 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:13:35.932085 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 13:13:35.932572 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:13:35.932204 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs podName:0865faea-916d-435f-88f5-d2b559f1d79a nodeName:}" failed. No retries permitted until 2026-04-16 13:15:37.932181688 +0000 UTC m=+252.521558754 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs") pod "network-metrics-daemon-9b59n" (UID: "0865faea-916d-435f-88f5-d2b559f1d79a") : secret "metrics-daemon-secret" not found
Apr 16 13:13:52.428769 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:13:52.428738 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g4rk4_5e2c6e75-298e-4014-bf52-0dc9f276e559/dns-node-resolver/0.log"
Apr 16 13:13:53.829710 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:13:53.829683 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-psf9p_fd1864d2-5d1c-41a0-84d9-dd4835e795d5/node-ca/0.log"
Apr 16 13:14:01.323262 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:14:01.323214 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" podUID="bd77993a-2dcc-45b2-bced-9f3ea48a6328"
Apr 16 13:14:01.339071 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:14:01.339035 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9r95z" podUID="7b5d7aa9-dd9d-487f-844d-3f40b038a994"
Apr 16 13:14:01.344249 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:14:01.344227 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-msm5w" podUID="5fdfc31d-52a5-4228-aa8d-7f803085d57e"
Apr 16 13:14:01.522420 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:01.522392 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9r95z"
Apr 16 13:14:01.522420 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:01.522402 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:14:01.522630 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:01.522402 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-msm5w"
Apr 16 13:14:03.082578 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:14:03.082533 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9b59n" podUID="0865faea-916d-435f-88f5-d2b559f1d79a"
Apr 16 13:14:05.531804 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:05.531771 2575 generic.go:358] "Generic (PLEG): container finished" podID="afea1806-f99c-4639-af8b-93b4c36066c4" containerID="35acc92cdbcf278a65b1b3e081258d31d673c3b28c676605dfcc049c51cf85b5" exitCode=1
Apr 16 13:14:05.532231 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:05.531843 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" event={"ID":"afea1806-f99c-4639-af8b-93b4c36066c4","Type":"ContainerDied","Data":"35acc92cdbcf278a65b1b3e081258d31d673c3b28c676605dfcc049c51cf85b5"}
Apr 16 13:14:05.532305 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:05.532236 2575 scope.go:117] "RemoveContainer" containerID="35acc92cdbcf278a65b1b3e081258d31d673c3b28c676605dfcc049c51cf85b5"
Apr 16 13:14:05.533232 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:05.533209 2575 generic.go:358] "Generic (PLEG): container finished" podID="c2b810cb-3b39-4054-87ec-7dea7c076ae2" containerID="a0455fe44eff2885d9b7bb91144e693d4527df927bd7915c372fe2fbe8e8e85c" exitCode=255
Apr 16 13:14:05.533337 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:05.533265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn" event={"ID":"c2b810cb-3b39-4054-87ec-7dea7c076ae2","Type":"ContainerDied","Data":"a0455fe44eff2885d9b7bb91144e693d4527df927bd7915c372fe2fbe8e8e85c"}
Apr 16 13:14:05.533536 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:05.533488 2575 scope.go:117] "RemoveContainer" containerID="a0455fe44eff2885d9b7bb91144e693d4527df927bd7915c372fe2fbe8e8e85c"
Apr 16 13:14:06.265427 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.265391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:14:06.267835 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.267804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"image-registry-7d7cbb65c5-z5fm8\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:14:06.325084 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.325058 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-sxsq4\""
Apr 16 13:14:06.332850 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.332823 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:14:06.366039 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.365989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w"
Apr 16 13:14:06.366229 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.366056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z"
Apr 16 13:14:06.368855 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.368832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fdfc31d-52a5-4228-aa8d-7f803085d57e-metrics-tls\") pod \"dns-default-msm5w\" (UID: \"5fdfc31d-52a5-4228-aa8d-7f803085d57e\") " pod="openshift-dns/dns-default-msm5w"
Apr 16 13:14:06.369213 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.369181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b5d7aa9-dd9d-487f-844d-3f40b038a994-cert\") pod \"ingress-canary-9r95z\" (UID: \"7b5d7aa9-dd9d-487f-844d-3f40b038a994\") " pod="openshift-ingress-canary/ingress-canary-9r95z"
Apr 16 13:14:06.448005 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.447974 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"]
Apr 16 13:14:06.451009 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:14:06.450985 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd77993a_2dcc_45b2_bced_9f3ea48a6328.slice/crio-97c4ea72b1a074866d92c9a1f7ed04fe46c3a82a1402bce472430c6fb467f9bd WatchSource:0}: Error finding container 97c4ea72b1a074866d92c9a1f7ed04fe46c3a82a1402bce472430c6fb467f9bd: Status 404 returned error can't find the container with id 97c4ea72b1a074866d92c9a1f7ed04fe46c3a82a1402bce472430c6fb467f9bd
Apr 16 13:14:06.536720 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.536632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" event={"ID":"bd77993a-2dcc-45b2-bced-9f3ea48a6328","Type":"ContainerStarted","Data":"53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705"}
Apr 16 13:14:06.536720 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.536673 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" event={"ID":"bd77993a-2dcc-45b2-bced-9f3ea48a6328","Type":"ContainerStarted","Data":"97c4ea72b1a074866d92c9a1f7ed04fe46c3a82a1402bce472430c6fb467f9bd"}
Apr 16 13:14:06.536720 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.536718 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"
Apr 16 13:14:06.538304 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.538283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8" event={"ID":"afea1806-f99c-4639-af8b-93b4c36066c4","Type":"ContainerStarted","Data":"4dd545861b6207659f41b1119762038df3834d363dfaa2293729de503d6cfa33"}
Apr 16 13:14:06.538574 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.538551 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"
Apr 16 13:14:06.539270 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.539234 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-86885c9c89-crvx8"
Apr 16 13:14:06.539894 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.539875 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6d7489bc99-kr9tn" event={"ID":"c2b810cb-3b39-4054-87ec-7dea7c076ae2","Type":"ContainerStarted","Data":"05c2580130b85a5be7ac4270b6974bd163f6b37c97da7c543558031ff0bdaf75"}
Apr 16 13:14:06.555383 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.555339 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" podStartSLOduration=160.555325863 podStartE2EDuration="2m40.555325863s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:14:06.553876515 +0000 UTC m=+161.143253649" watchObservedRunningTime="2026-04-16 13:14:06.555325863 +0000 UTC m=+161.144702951"
Apr 16 13:14:06.625491 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.625454 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrfd8\""
Apr 16 13:14:06.625491 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.625493 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xkkx5\""
Apr 16 13:14:06.633939 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.633910 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-msm5w"
Apr 16 13:14:06.633939 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.633923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9r95z"
Apr 16 13:14:06.759766 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.759722 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9r95z"]
Apr 16 13:14:06.762513 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:14:06.762485 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b5d7aa9_dd9d_487f_844d_3f40b038a994.slice/crio-36060d3ee27ad0ba91237fdf29a735e064523dcefa0923cfe83900776a88bffe WatchSource:0}: Error finding container 36060d3ee27ad0ba91237fdf29a735e064523dcefa0923cfe83900776a88bffe: Status 404 returned error can't find the container with id 36060d3ee27ad0ba91237fdf29a735e064523dcefa0923cfe83900776a88bffe
Apr 16 13:14:06.775050 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:06.775023 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-msm5w"]
Apr 16 13:14:06.778277 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:14:06.778253 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fdfc31d_52a5_4228_aa8d_7f803085d57e.slice/crio-91f9784b6b6395b6ee4f460939ed22ba54dd63d54d6306d89ccc6c216b62e491 WatchSource:0}: Error finding container 91f9784b6b6395b6ee4f460939ed22ba54dd63d54d6306d89ccc6c216b62e491: Status 404 returned error can't find the container with id 91f9784b6b6395b6ee4f460939ed22ba54dd63d54d6306d89ccc6c216b62e491
Apr 16 13:14:07.543736 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:07.543696 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-msm5w" event={"ID":"5fdfc31d-52a5-4228-aa8d-7f803085d57e","Type":"ContainerStarted","Data":"91f9784b6b6395b6ee4f460939ed22ba54dd63d54d6306d89ccc6c216b62e491"}
Apr 16 13:14:07.544760 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:07.544727 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9r95z" event={"ID":"7b5d7aa9-dd9d-487f-844d-3f40b038a994","Type":"ContainerStarted","Data":"36060d3ee27ad0ba91237fdf29a735e064523dcefa0923cfe83900776a88bffe"}
Apr 16 13:14:09.551088 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:09.551053 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9r95z" event={"ID":"7b5d7aa9-dd9d-487f-844d-3f40b038a994","Type":"ContainerStarted","Data":"c955426b5138a831ee7ee5dbf3201578334b7d760af1c2bace670cee46268fbd"}
Apr 16 13:14:09.552546 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:09.552519 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-msm5w" event={"ID":"5fdfc31d-52a5-4228-aa8d-7f803085d57e","Type":"ContainerStarted","Data":"6d69f00e53f8c9972fb6ae8bfe94a3ba739fbd72ebaf980077fa03846dddb9f5"}
Apr 16 13:14:09.552631 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:09.552555 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-msm5w" event={"ID":"5fdfc31d-52a5-4228-aa8d-7f803085d57e","Type":"ContainerStarted","Data":"8e36ba4cd576f6dbd495fc1fc4de4b7d8d91a7c71eb4e22ed2eff13732b2cd1a"}
Apr 16 13:14:09.552688 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:09.552669 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-msm5w"
Apr 16 13:14:09.564197 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:09.564160 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9r95z" podStartSLOduration=129.54258112 podStartE2EDuration="2m11.56414484s" podCreationTimestamp="2026-04-16 13:11:58 +0000 UTC" firstStartedPulling="2026-04-16 13:14:06.764317695 +0000 UTC m=+161.353694760" lastFinishedPulling="2026-04-16 13:14:08.785881409 +0000 UTC m=+163.375258480" observedRunningTime="2026-04-16 13:14:09.563797044 +0000 UTC m=+164.153174133" watchObservedRunningTime="2026-04-16 13:14:09.56414484 +0000 UTC m=+164.153521926"
Apr 16 13:14:09.578571 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:09.578525 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-msm5w" podStartSLOduration=129.576194824 podStartE2EDuration="2m11.578512373s" podCreationTimestamp="2026-04-16 13:11:58 +0000 UTC" firstStartedPulling="2026-04-16 13:14:06.780057056 +0000 UTC m=+161.369434125" lastFinishedPulling="2026-04-16 13:14:08.782374606 +0000 UTC m=+163.371751674" observedRunningTime="2026-04-16 13:14:09.577255574 +0000 UTC m=+164.166632661" watchObservedRunningTime="2026-04-16 13:14:09.578512373 +0000 UTC m=+164.167889460"
Apr 16 13:14:13.972983 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:13.972947 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2b8cf"]
Apr 16 13:14:13.978132 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:13.978094 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:13.980588 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:13.980561 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 13:14:13.980588 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:13.980575 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 13:14:13.980791 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:13.980569 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-mhnxk\""
Apr 16 13:14:13.980791 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:13.980570 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 13:14:13.983828 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:13.983809 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 13:14:13.986264 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:13.986242 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2b8cf"]
Apr 16 13:14:14.028206 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.028174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf4zh\" (UniqueName: \"kubernetes.io/projected/ba1775da-40a0-4aa2-bb2e-895725999757-kube-api-access-jf4zh\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:14.028362 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.028228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ba1775da-40a0-4aa2-bb2e-895725999757-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:14.028362 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.028261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ba1775da-40a0-4aa2-bb2e-895725999757-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:14.028362 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.028307 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ba1775da-40a0-4aa2-bb2e-895725999757-crio-socket\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:14.028362 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.028331 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ba1775da-40a0-4aa2-bb2e-895725999757-data-volume\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:14.129238 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.129206 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ba1775da-40a0-4aa2-bb2e-895725999757-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:14.129425 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.129249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ba1775da-40a0-4aa2-bb2e-895725999757-crio-socket\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:14.129425 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.129272 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ba1775da-40a0-4aa2-bb2e-895725999757-data-volume\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:14.129425 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.129321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jf4zh\" (UniqueName: \"kubernetes.io/projected/ba1775da-40a0-4aa2-bb2e-895725999757-kube-api-access-jf4zh\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:14.129425 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.129357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ba1775da-40a0-4aa2-bb2e-895725999757-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf"
Apr 16 13:14:14.129425 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.129412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/ba1775da-40a0-4aa2-bb2e-895725999757-crio-socket\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf" Apr 16 13:14:14.129725 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.129699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ba1775da-40a0-4aa2-bb2e-895725999757-data-volume\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf" Apr 16 13:14:14.129877 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.129859 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ba1775da-40a0-4aa2-bb2e-895725999757-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf" Apr 16 13:14:14.131471 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.131453 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ba1775da-40a0-4aa2-bb2e-895725999757-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf" Apr 16 13:14:14.139988 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.139968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf4zh\" (UniqueName: \"kubernetes.io/projected/ba1775da-40a0-4aa2-bb2e-895725999757-kube-api-access-jf4zh\") pod \"insights-runtime-extractor-2b8cf\" (UID: \"ba1775da-40a0-4aa2-bb2e-895725999757\") " pod="openshift-insights/insights-runtime-extractor-2b8cf" Apr 16 13:14:14.287519 ip-10-0-142-166 kubenswrapper[2575]: 
I0416 13:14:14.287423 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2b8cf" Apr 16 13:14:14.409081 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.409051 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2b8cf"] Apr 16 13:14:14.412261 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:14:14.412233 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba1775da_40a0_4aa2_bb2e_895725999757.slice/crio-5b67c3f5673d650d2c01eac319f7c25f8a0ea95df309d0bddb38b575d9d6908b WatchSource:0}: Error finding container 5b67c3f5673d650d2c01eac319f7c25f8a0ea95df309d0bddb38b575d9d6908b: Status 404 returned error can't find the container with id 5b67c3f5673d650d2c01eac319f7c25f8a0ea95df309d0bddb38b575d9d6908b Apr 16 13:14:14.566315 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.566233 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2b8cf" event={"ID":"ba1775da-40a0-4aa2-bb2e-895725999757","Type":"ContainerStarted","Data":"67c42d03c30329a28789336ec8e50c449694fdce82ebdcd4d77870ab4338f5a5"} Apr 16 13:14:14.566315 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:14.566268 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2b8cf" event={"ID":"ba1775da-40a0-4aa2-bb2e-895725999757","Type":"ContainerStarted","Data":"5b67c3f5673d650d2c01eac319f7c25f8a0ea95df309d0bddb38b575d9d6908b"} Apr 16 13:14:15.571798 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:15.571753 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2b8cf" event={"ID":"ba1775da-40a0-4aa2-bb2e-895725999757","Type":"ContainerStarted","Data":"0b175d06927b58f1d005da5162571d319d05572a9d537c035f11e50ce61b20e2"} Apr 16 13:14:17.578261 ip-10-0-142-166 
kubenswrapper[2575]: I0416 13:14:17.578230 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2b8cf" event={"ID":"ba1775da-40a0-4aa2-bb2e-895725999757","Type":"ContainerStarted","Data":"a84938428b72a07b5797e0399ac8debbd12b1d4009e7f698fb52fcca478733a0"} Apr 16 13:14:17.593640 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:17.593591 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2b8cf" podStartSLOduration=2.507000706 podStartE2EDuration="4.593574645s" podCreationTimestamp="2026-04-16 13:14:13 +0000 UTC" firstStartedPulling="2026-04-16 13:14:14.469420648 +0000 UTC m=+169.058797713" lastFinishedPulling="2026-04-16 13:14:16.555994586 +0000 UTC m=+171.145371652" observedRunningTime="2026-04-16 13:14:17.592770626 +0000 UTC m=+172.182147732" watchObservedRunningTime="2026-04-16 13:14:17.593574645 +0000 UTC m=+172.182951734" Apr 16 13:14:18.042950 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:18.042907 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n" Apr 16 13:14:19.557698 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:19.557669 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-msm5w" Apr 16 13:14:26.337013 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:26.336975 2575 patch_prober.go:28] interesting pod/image-registry-7d7cbb65c5-z5fm8 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 13:14:26.337413 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:26.337038 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" podUID="bd77993a-2dcc-45b2-bced-9f3ea48a6328" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 13:14:27.548781 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:27.548754 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:14:30.511807 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.511770 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hgwk4"] Apr 16 13:14:30.515803 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.515787 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.517590 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.517569 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 13:14:30.518017 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.518001 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 13:14:30.518154 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.518138 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 13:14:30.518312 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.518294 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 13:14:30.518485 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.518449 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 13:14:30.518742 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.518724 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2q4p2\"" Apr 16 13:14:30.518834 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.518753 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 13:14:30.551582 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.551548 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-wtmp\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " 
pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.551720 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.551588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-sys\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.551720 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.551651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-textfile\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.551720 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.551707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-tls\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.551849 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.551727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.551849 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.551746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-root\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.551849 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.551782 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws4wn\" (UniqueName: \"kubernetes.io/projected/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-kube-api-access-ws4wn\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.551849 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.551841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-metrics-client-ca\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.551960 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.551881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-accelerators-collector-config\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.652956 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.652918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-tls\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.652956 ip-10-0-142-166 kubenswrapper[2575]: 
I0416 13:14:30.652952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653210 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.652974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-root\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653210 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.652998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws4wn\" (UniqueName: \"kubernetes.io/projected/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-kube-api-access-ws4wn\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653210 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.653030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-metrics-client-ca\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653210 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.653052 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-root\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653210 ip-10-0-142-166 
kubenswrapper[2575]: I0416 13:14:30.653073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-accelerators-collector-config\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653210 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:14:30.653060 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 13:14:30.653210 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.653133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-wtmp\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653210 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:14:30.653162 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-tls podName:5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c nodeName:}" failed. No retries permitted until 2026-04-16 13:14:31.153140652 +0000 UTC m=+185.742517723 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-tls") pod "node-exporter-hgwk4" (UID: "5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c") : secret "node-exporter-tls" not found Apr 16 13:14:30.653210 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.653200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-sys\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653673 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.653239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-textfile\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653673 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.653248 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-sys\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653673 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.653252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-wtmp\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653673 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.653550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-textfile\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653868 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.653695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-metrics-client-ca\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.653868 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.653729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-accelerators-collector-config\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.655498 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.655481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:30.671055 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:30.671029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws4wn\" (UniqueName: \"kubernetes.io/projected/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-kube-api-access-ws4wn\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:31.156625 ip-10-0-142-166 
kubenswrapper[2575]: I0416 13:14:31.156589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-tls\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:31.158957 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:31.158932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c-node-exporter-tls\") pod \"node-exporter-hgwk4\" (UID: \"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c\") " pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:31.424929 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:31.424846 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hgwk4" Apr 16 13:14:31.433407 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:14:31.433376 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c5fd0e8_bce8_4b35_8e74_ac10bf55c92c.slice/crio-e90c325ef11d20ceb19cfdfdd19e759a6cb397ee1794496d0519228274962c36 WatchSource:0}: Error finding container e90c325ef11d20ceb19cfdfdd19e759a6cb397ee1794496d0519228274962c36: Status 404 returned error can't find the container with id e90c325ef11d20ceb19cfdfdd19e759a6cb397ee1794496d0519228274962c36 Apr 16 13:14:31.614203 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:31.614167 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hgwk4" event={"ID":"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c","Type":"ContainerStarted","Data":"e90c325ef11d20ceb19cfdfdd19e759a6cb397ee1794496d0519228274962c36"} Apr 16 13:14:32.618242 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:32.618205 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c" containerID="4442fbd9c37e4336723a4796c678ddbccfda2f16d47d8285eac3b5b584589228" exitCode=0 Apr 16 13:14:32.618630 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:32.618290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hgwk4" event={"ID":"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c","Type":"ContainerDied","Data":"4442fbd9c37e4336723a4796c678ddbccfda2f16d47d8285eac3b5b584589228"} Apr 16 13:14:33.622491 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:33.622449 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hgwk4" event={"ID":"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c","Type":"ContainerStarted","Data":"adca579a1c5ee6c815ca1fb97ddcaa9cbad02ebc6708b757fb709b73d9fdcf68"} Apr 16 13:14:33.622491 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:33.622494 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hgwk4" event={"ID":"5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c","Type":"ContainerStarted","Data":"40a59ddb18683e8b882fbc558be2f3559b3a4c10d5c8ad5dad7bed85d47203a7"} Apr 16 13:14:33.640748 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:33.640693 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hgwk4" podStartSLOduration=2.966918896 podStartE2EDuration="3.640679687s" podCreationTimestamp="2026-04-16 13:14:30 +0000 UTC" firstStartedPulling="2026-04-16 13:14:31.435303489 +0000 UTC m=+186.024680578" lastFinishedPulling="2026-04-16 13:14:32.109064303 +0000 UTC m=+186.698441369" observedRunningTime="2026-04-16 13:14:33.63927985 +0000 UTC m=+188.228656939" watchObservedRunningTime="2026-04-16 13:14:33.640679687 +0000 UTC m=+188.230056774" Apr 16 13:14:36.278682 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:36.278646 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"] Apr 16 
13:14:48.601098 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:48.601053 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" podUID="fbf1244e-299c-4320-99ad-e305cdf1a83b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 13:14:58.601439 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:14:58.601400 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" podUID="fbf1244e-299c-4320-99ad-e305cdf1a83b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 13:15:01.297465 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.297424 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" podUID="bd77993a-2dcc-45b2-bced-9f3ea48a6328" containerName="registry" containerID="cri-o://53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705" gracePeriod=30 Apr 16 13:15:01.525108 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.525081 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:15:01.690059 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.689973 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd77993a-2dcc-45b2-bced-9f3ea48a6328" containerID="53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705" exitCode=0 Apr 16 13:15:01.690059 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.690017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" event={"ID":"bd77993a-2dcc-45b2-bced-9f3ea48a6328","Type":"ContainerDied","Data":"53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705"} Apr 16 13:15:01.690059 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.690032 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" Apr 16 13:15:01.690059 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.690039 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d7cbb65c5-z5fm8" event={"ID":"bd77993a-2dcc-45b2-bced-9f3ea48a6328","Type":"ContainerDied","Data":"97c4ea72b1a074866d92c9a1f7ed04fe46c3a82a1402bce472430c6fb467f9bd"} Apr 16 13:15:01.690059 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.690055 2575 scope.go:117] "RemoveContainer" containerID="53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705" Apr 16 13:15:01.696894 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.696871 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-installation-pull-secrets\") pod \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") " Apr 16 13:15:01.697015 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.696914 2575 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-trusted-ca\") pod \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") "
Apr 16 13:15:01.697015 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.696954 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-image-registry-private-configuration\") pod \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") "
Apr 16 13:15:01.697015 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.696984 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") pod \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") "
Apr 16 13:15:01.697242 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.697020 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmxxz\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-kube-api-access-vmxxz\") pod \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") "
Apr 16 13:15:01.697242 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.697050 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-bound-sa-token\") pod \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") "
Apr 16 13:15:01.697242 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.697146 2575 scope.go:117] "RemoveContainer" containerID="53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705"
Apr 16 13:15:01.697394 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.697360 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-certificates\") pod \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") "
Apr 16 13:15:01.697447 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.697387 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bd77993a-2dcc-45b2-bced-9f3ea48a6328" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:15:01.697447 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.697410 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd77993a-2dcc-45b2-bced-9f3ea48a6328-ca-trust-extracted\") pod \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\" (UID: \"bd77993a-2dcc-45b2-bced-9f3ea48a6328\") "
Apr 16 13:15:01.697665 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.697643 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-trusted-ca\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\""
Apr 16 13:15:01.697939 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.697884 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bd77993a-2dcc-45b2-bced-9f3ea48a6328" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:15:01.698408 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:15:01.698370 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705\": container with ID starting with 53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705 not found: ID does not exist" containerID="53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705"
Apr 16 13:15:01.698512 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.698410 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705"} err="failed to get container status \"53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705\": rpc error: code = NotFound desc = could not find container \"53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705\": container with ID starting with 53f5b7279a0817aacd824cb4aef16fdea4d0013aad73ef558a8440b1399a4705 not found: ID does not exist"
Apr 16 13:15:01.699669 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.699645 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bd77993a-2dcc-45b2-bced-9f3ea48a6328" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:15:01.699771 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.699746 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bd77993a-2dcc-45b2-bced-9f3ea48a6328" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:01.699836 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.699790 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bd77993a-2dcc-45b2-bced-9f3ea48a6328" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:15:01.699836 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.699820 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "bd77993a-2dcc-45b2-bced-9f3ea48a6328" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:01.699921 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.699895 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-kube-api-access-vmxxz" (OuterVolumeSpecName: "kube-api-access-vmxxz") pod "bd77993a-2dcc-45b2-bced-9f3ea48a6328" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328"). InnerVolumeSpecName "kube-api-access-vmxxz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:15:01.705931 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.705908 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd77993a-2dcc-45b2-bced-9f3ea48a6328-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bd77993a-2dcc-45b2-bced-9f3ea48a6328" (UID: "bd77993a-2dcc-45b2-bced-9f3ea48a6328"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:15:01.798188 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.798152 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vmxxz\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-kube-api-access-vmxxz\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\""
Apr 16 13:15:01.798188 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.798180 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-bound-sa-token\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\""
Apr 16 13:15:01.798188 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.798193 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-certificates\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\""
Apr 16 13:15:01.798408 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.798202 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bd77993a-2dcc-45b2-bced-9f3ea48a6328-ca-trust-extracted\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\""
Apr 16 13:15:01.798408 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.798211 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-installation-pull-secrets\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\""
Apr 16 13:15:01.798408 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.798222 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bd77993a-2dcc-45b2-bced-9f3ea48a6328-image-registry-private-configuration\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\""
Apr 16 13:15:01.798408 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:01.798233 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bd77993a-2dcc-45b2-bced-9f3ea48a6328-registry-tls\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\""
Apr 16 13:15:02.009580 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:02.009550 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"]
Apr 16 13:15:02.012538 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:02.012512 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7d7cbb65c5-z5fm8"]
Apr 16 13:15:02.046364 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:02.046340 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd77993a-2dcc-45b2-bced-9f3ea48a6328" path="/var/lib/kubelet/pods/bd77993a-2dcc-45b2-bced-9f3ea48a6328/volumes"
Apr 16 13:15:08.601751 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:08.601709 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" podUID="fbf1244e-299c-4320-99ad-e305cdf1a83b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 13:15:08.602165 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:08.601780 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w"
Apr 16 13:15:08.602301 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:08.602256 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"63da4c46b3303581a36b0469998cc1432b958641cc2fa400bb23443a56737afa"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 13:15:08.602338 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:08.602323 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" podUID="fbf1244e-299c-4320-99ad-e305cdf1a83b" containerName="service-proxy" containerID="cri-o://63da4c46b3303581a36b0469998cc1432b958641cc2fa400bb23443a56737afa" gracePeriod=30
Apr 16 13:15:09.714314 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:09.714282 2575 generic.go:358] "Generic (PLEG): container finished" podID="fbf1244e-299c-4320-99ad-e305cdf1a83b" containerID="63da4c46b3303581a36b0469998cc1432b958641cc2fa400bb23443a56737afa" exitCode=2
Apr 16 13:15:09.714695 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:09.714344 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" event={"ID":"fbf1244e-299c-4320-99ad-e305cdf1a83b","Type":"ContainerDied","Data":"63da4c46b3303581a36b0469998cc1432b958641cc2fa400bb23443a56737afa"}
Apr 16 13:15:09.714695 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:09.714384 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55f78db797-w788w" event={"ID":"fbf1244e-299c-4320-99ad-e305cdf1a83b","Type":"ContainerStarted","Data":"2ec3156bb52f732731ded160ea4ae8bc2cfa8ee7f427dc73c34005ed722ce7c2"}
Apr 16 13:15:37.950934 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:37.950893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:15:37.953276 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:37.953251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0865faea-916d-435f-88f5-d2b559f1d79a-metrics-certs\") pod \"network-metrics-daemon-9b59n\" (UID: \"0865faea-916d-435f-88f5-d2b559f1d79a\") " pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:15:38.146354 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:38.146325 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hkf74\""
Apr 16 13:15:38.155080 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:38.155040 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9b59n"
Apr 16 13:15:38.269516 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:38.269485 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9b59n"]
Apr 16 13:15:38.272549 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:15:38.272503 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0865faea_916d_435f_88f5_d2b559f1d79a.slice/crio-af60eae12488044038aff949c02e1d1ac60806da9ee5ea6f35ed66985b813670 WatchSource:0}: Error finding container af60eae12488044038aff949c02e1d1ac60806da9ee5ea6f35ed66985b813670: Status 404 returned error can't find the container with id af60eae12488044038aff949c02e1d1ac60806da9ee5ea6f35ed66985b813670
Apr 16 13:15:38.796653 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:38.796614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9b59n" event={"ID":"0865faea-916d-435f-88f5-d2b559f1d79a","Type":"ContainerStarted","Data":"af60eae12488044038aff949c02e1d1ac60806da9ee5ea6f35ed66985b813670"}
Apr 16 13:15:39.803302 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:39.803267 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9b59n" event={"ID":"0865faea-916d-435f-88f5-d2b559f1d79a","Type":"ContainerStarted","Data":"644f780233989436c9498fe978e91992ebac15187e585bfb3a82df34f617f0a1"}
Apr 16 13:15:39.803302 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:39.803305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9b59n" event={"ID":"0865faea-916d-435f-88f5-d2b559f1d79a","Type":"ContainerStarted","Data":"bc84ded963e731789704220785b2780042c58582967a603c0e8d91948c1712a6"}
Apr 16 13:15:39.819067 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:15:39.819014 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9b59n" podStartSLOduration=252.658136957 podStartE2EDuration="4m13.818998284s" podCreationTimestamp="2026-04-16 13:11:26 +0000 UTC" firstStartedPulling="2026-04-16 13:15:38.27433841 +0000 UTC m=+252.863715489" lastFinishedPulling="2026-04-16 13:15:39.435199738 +0000 UTC m=+254.024576816" observedRunningTime="2026-04-16 13:15:39.816993801 +0000 UTC m=+254.406370901" watchObservedRunningTime="2026-04-16 13:15:39.818998284 +0000 UTC m=+254.408375371"
Apr 16 13:16:25.918910 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:16:25.918874 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log"
Apr 16 13:16:25.918910 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:16:25.918899 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log"
Apr 16 13:16:25.921763 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:16:25.921745 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 13:17:28.125275 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.125239 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-gshj5"]
Apr 16 13:17:28.125800 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.125543 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd77993a-2dcc-45b2-bced-9f3ea48a6328" containerName="registry"
Apr 16 13:17:28.125800 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.125577 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd77993a-2dcc-45b2-bced-9f3ea48a6328" containerName="registry"
Apr 16 13:17:28.125800 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.125649 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd77993a-2dcc-45b2-bced-9f3ea48a6328" containerName="registry"
Apr 16 13:17:28.127631 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.127612 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-gshj5"
Apr 16 13:17:28.129602 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.129577 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 13:17:28.129717 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.129577 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 13:17:28.130088 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.130069 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-sp72p\""
Apr 16 13:17:28.134634 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.134191 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-gshj5"]
Apr 16 13:17:28.196619 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.196580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thgd7\" (UniqueName: \"kubernetes.io/projected/b0befe4e-8da3-490c-bce5-1c28bac8ab62-kube-api-access-thgd7\") pod \"cert-manager-759f64656b-gshj5\" (UID: \"b0befe4e-8da3-490c-bce5-1c28bac8ab62\") " pod="cert-manager/cert-manager-759f64656b-gshj5"
Apr 16 13:17:28.196619 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.196621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0befe4e-8da3-490c-bce5-1c28bac8ab62-bound-sa-token\") pod \"cert-manager-759f64656b-gshj5\" (UID: \"b0befe4e-8da3-490c-bce5-1c28bac8ab62\") " pod="cert-manager/cert-manager-759f64656b-gshj5"
Apr 16 13:17:28.297560 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.297522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thgd7\" (UniqueName: \"kubernetes.io/projected/b0befe4e-8da3-490c-bce5-1c28bac8ab62-kube-api-access-thgd7\") pod \"cert-manager-759f64656b-gshj5\" (UID: \"b0befe4e-8da3-490c-bce5-1c28bac8ab62\") " pod="cert-manager/cert-manager-759f64656b-gshj5"
Apr 16 13:17:28.297560 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.297567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0befe4e-8da3-490c-bce5-1c28bac8ab62-bound-sa-token\") pod \"cert-manager-759f64656b-gshj5\" (UID: \"b0befe4e-8da3-490c-bce5-1c28bac8ab62\") " pod="cert-manager/cert-manager-759f64656b-gshj5"
Apr 16 13:17:28.305029 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.304991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0befe4e-8da3-490c-bce5-1c28bac8ab62-bound-sa-token\") pod \"cert-manager-759f64656b-gshj5\" (UID: \"b0befe4e-8da3-490c-bce5-1c28bac8ab62\") " pod="cert-manager/cert-manager-759f64656b-gshj5"
Apr 16 13:17:28.305182 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.305057 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thgd7\" (UniqueName: \"kubernetes.io/projected/b0befe4e-8da3-490c-bce5-1c28bac8ab62-kube-api-access-thgd7\") pod \"cert-manager-759f64656b-gshj5\" (UID: \"b0befe4e-8da3-490c-bce5-1c28bac8ab62\") " pod="cert-manager/cert-manager-759f64656b-gshj5"
Apr 16 13:17:28.437736 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.437651 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-gshj5"
Apr 16 13:17:28.552579 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.552545 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-gshj5"]
Apr 16 13:17:28.555717 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:17:28.555676 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0befe4e_8da3_490c_bce5_1c28bac8ab62.slice/crio-22accc5278f2cf7f53b12b2db55d4ad182652f6019ae878f03d957212d4e1d02 WatchSource:0}: Error finding container 22accc5278f2cf7f53b12b2db55d4ad182652f6019ae878f03d957212d4e1d02: Status 404 returned error can't find the container with id 22accc5278f2cf7f53b12b2db55d4ad182652f6019ae878f03d957212d4e1d02
Apr 16 13:17:28.557528 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:28.557511 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:17:29.067516 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:29.067479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-gshj5" event={"ID":"b0befe4e-8da3-490c-bce5-1c28bac8ab62","Type":"ContainerStarted","Data":"22accc5278f2cf7f53b12b2db55d4ad182652f6019ae878f03d957212d4e1d02"}
Apr 16 13:17:33.080630 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:33.080594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-gshj5" event={"ID":"b0befe4e-8da3-490c-bce5-1c28bac8ab62","Type":"ContainerStarted","Data":"91143ed0a1a998559fce2ac4cf5d297f007aff3ea5fbf08eeac1adf3bcde0967"}
Apr 16 13:17:33.094096 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:33.094046 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-gshj5" podStartSLOduration=0.997312792 podStartE2EDuration="5.094028052s" podCreationTimestamp="2026-04-16 13:17:28 +0000 UTC" firstStartedPulling="2026-04-16 13:17:28.557693464 +0000 UTC m=+363.147070540" lastFinishedPulling="2026-04-16 13:17:32.654408735 +0000 UTC m=+367.243785800" observedRunningTime="2026-04-16 13:17:33.09304079 +0000 UTC m=+367.682417898" watchObservedRunningTime="2026-04-16 13:17:33.094028052 +0000 UTC m=+367.683405140"
Apr 16 13:17:46.912706 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:46.912671 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"]
Apr 16 13:17:46.914802 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:46.914779 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:46.917352 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:46.917319 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 13:17:46.917492 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:46.917370 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 13:17:46.917492 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:46.917320 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 13:17:46.917492 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:46.917455 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-5gs69\""
Apr 16 13:17:46.917492 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:46.917470 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 13:17:46.930187 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:46.930159 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"]
Apr 16 13:17:47.027603 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.027563 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e85e7df6-cc76-468f-83fd-9907ddde903f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5889847794-7ltx2\" (UID: \"e85e7df6-cc76-468f-83fd-9907ddde903f\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:47.027603 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.027606 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e85e7df6-cc76-468f-83fd-9907ddde903f-webhook-cert\") pod \"opendatahub-operator-controller-manager-5889847794-7ltx2\" (UID: \"e85e7df6-cc76-468f-83fd-9907ddde903f\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:47.027815 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.027706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvvnl\" (UniqueName: \"kubernetes.io/projected/e85e7df6-cc76-468f-83fd-9907ddde903f-kube-api-access-jvvnl\") pod \"opendatahub-operator-controller-manager-5889847794-7ltx2\" (UID: \"e85e7df6-cc76-468f-83fd-9907ddde903f\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:47.128746 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.128716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvvnl\" (UniqueName: \"kubernetes.io/projected/e85e7df6-cc76-468f-83fd-9907ddde903f-kube-api-access-jvvnl\") pod \"opendatahub-operator-controller-manager-5889847794-7ltx2\" (UID: \"e85e7df6-cc76-468f-83fd-9907ddde903f\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:47.128903 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.128754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e85e7df6-cc76-468f-83fd-9907ddde903f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5889847794-7ltx2\" (UID: \"e85e7df6-cc76-468f-83fd-9907ddde903f\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:47.128903 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.128777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e85e7df6-cc76-468f-83fd-9907ddde903f-webhook-cert\") pod \"opendatahub-operator-controller-manager-5889847794-7ltx2\" (UID: \"e85e7df6-cc76-468f-83fd-9907ddde903f\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:47.131268 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.131244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e85e7df6-cc76-468f-83fd-9907ddde903f-webhook-cert\") pod \"opendatahub-operator-controller-manager-5889847794-7ltx2\" (UID: \"e85e7df6-cc76-468f-83fd-9907ddde903f\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:47.131389 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.131294 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e85e7df6-cc76-468f-83fd-9907ddde903f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5889847794-7ltx2\" (UID: \"e85e7df6-cc76-468f-83fd-9907ddde903f\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:47.137066 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.137038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvvnl\" (UniqueName: \"kubernetes.io/projected/e85e7df6-cc76-468f-83fd-9907ddde903f-kube-api-access-jvvnl\") pod \"opendatahub-operator-controller-manager-5889847794-7ltx2\" (UID: \"e85e7df6-cc76-468f-83fd-9907ddde903f\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:47.224536 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.224507 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:47.344139 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:47.344095 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"]
Apr 16 13:17:47.346891 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:17:47.346865 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode85e7df6_cc76_468f_83fd_9907ddde903f.slice/crio-237b17c323b30eba43e3137b3adb1155a12009530f7f9cf8addf726216df2ccf WatchSource:0}: Error finding container 237b17c323b30eba43e3137b3adb1155a12009530f7f9cf8addf726216df2ccf: Status 404 returned error can't find the container with id 237b17c323b30eba43e3137b3adb1155a12009530f7f9cf8addf726216df2ccf
Apr 16 13:17:48.122915 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:48.122875 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2" event={"ID":"e85e7df6-cc76-468f-83fd-9907ddde903f","Type":"ContainerStarted","Data":"237b17c323b30eba43e3137b3adb1155a12009530f7f9cf8addf726216df2ccf"}
Apr 16 13:17:50.130776 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:50.130680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2" event={"ID":"e85e7df6-cc76-468f-83fd-9907ddde903f","Type":"ContainerStarted","Data":"dc8ad3c3729db6bc790e3f7690be0d9b71e07666991c4cb79e5af6301176824b"}
Apr 16 13:17:50.131194 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:50.130917 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:17:50.150807 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:17:50.150753 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2" podStartSLOduration=1.750671885 podStartE2EDuration="4.150737031s" podCreationTimestamp="2026-04-16 13:17:46 +0000 UTC" firstStartedPulling="2026-04-16 13:17:47.349139092 +0000 UTC m=+381.938516158" lastFinishedPulling="2026-04-16 13:17:49.749204238 +0000 UTC m=+384.338581304" observedRunningTime="2026-04-16 13:17:50.149449137 +0000 UTC m=+384.738826271" watchObservedRunningTime="2026-04-16 13:17:50.150737031 +0000 UTC m=+384.740114118"
Apr 16 13:18:01.135987 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:01.135952 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-7ltx2"
Apr 16 13:18:14.314713 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.314676 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"]
Apr 16 13:18:14.316683 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.316665 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"
Apr 16 13:18:14.318752 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.318729 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-4bnc5\""
Apr 16 13:18:14.320712 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.319606 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 13:18:14.320712 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.319654 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 16 13:18:14.320712 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.319660 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 13:18:14.320712 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.319804 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 16 13:18:14.326265 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.326233 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74e58751-2715-4b52-b472-4dc7b18630f7-tmp\") pod \"kube-auth-proxy-7485ccd7bf-5qzpc\" (UID: \"74e58751-2715-4b52-b472-4dc7b18630f7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"
Apr 16 13:18:14.326399 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.326286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/74e58751-2715-4b52-b472-4dc7b18630f7-tls-certs\") pod \"kube-auth-proxy-7485ccd7bf-5qzpc\" (UID: \"74e58751-2715-4b52-b472-4dc7b18630f7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"
Apr 16 13:18:14.326399 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.326317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9jg\" (UniqueName: \"kubernetes.io/projected/74e58751-2715-4b52-b472-4dc7b18630f7-kube-api-access-7t9jg\") pod \"kube-auth-proxy-7485ccd7bf-5qzpc\" (UID: \"74e58751-2715-4b52-b472-4dc7b18630f7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"
Apr 16 13:18:14.326835 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.326766 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"]
Apr 16 13:18:14.426921 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.426872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74e58751-2715-4b52-b472-4dc7b18630f7-tmp\") pod \"kube-auth-proxy-7485ccd7bf-5qzpc\" (UID: \"74e58751-2715-4b52-b472-4dc7b18630f7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"
Apr 16 13:18:14.426921 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.426922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/74e58751-2715-4b52-b472-4dc7b18630f7-tls-certs\") pod \"kube-auth-proxy-7485ccd7bf-5qzpc\" (UID: \"74e58751-2715-4b52-b472-4dc7b18630f7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"
Apr 16 13:18:14.427220 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.426942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9jg\" (UniqueName: \"kubernetes.io/projected/74e58751-2715-4b52-b472-4dc7b18630f7-kube-api-access-7t9jg\") pod \"kube-auth-proxy-7485ccd7bf-5qzpc\" (UID: \"74e58751-2715-4b52-b472-4dc7b18630f7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"
Apr 16 13:18:14.429146 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.429108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74e58751-2715-4b52-b472-4dc7b18630f7-tmp\") pod \"kube-auth-proxy-7485ccd7bf-5qzpc\" (UID: \"74e58751-2715-4b52-b472-4dc7b18630f7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"
Apr 16 13:18:14.429413 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.429395 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/74e58751-2715-4b52-b472-4dc7b18630f7-tls-certs\") pod \"kube-auth-proxy-7485ccd7bf-5qzpc\" (UID: \"74e58751-2715-4b52-b472-4dc7b18630f7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"
Apr 16 13:18:14.435813 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.435782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9jg\" (UniqueName: \"kubernetes.io/projected/74e58751-2715-4b52-b472-4dc7b18630f7-kube-api-access-7t9jg\") pod \"kube-auth-proxy-7485ccd7bf-5qzpc\" (UID: \"74e58751-2715-4b52-b472-4dc7b18630f7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"
Apr 16 13:18:14.629828 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.629743 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc" Apr 16 13:18:14.749253 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:14.749229 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc"] Apr 16 13:18:14.751909 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:18:14.751882 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74e58751_2715_4b52_b472_4dc7b18630f7.slice/crio-54c5e8575c6ae6057e9d77402a07014f83573d6f669e8a3fd59b2bc5fe72adc3 WatchSource:0}: Error finding container 54c5e8575c6ae6057e9d77402a07014f83573d6f669e8a3fd59b2bc5fe72adc3: Status 404 returned error can't find the container with id 54c5e8575c6ae6057e9d77402a07014f83573d6f669e8a3fd59b2bc5fe72adc3 Apr 16 13:18:15.201166 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:15.201109 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc" event={"ID":"74e58751-2715-4b52-b472-4dc7b18630f7","Type":"ContainerStarted","Data":"54c5e8575c6ae6057e9d77402a07014f83573d6f669e8a3fd59b2bc5fe72adc3"} Apr 16 13:18:18.211095 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.211013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc" event={"ID":"74e58751-2715-4b52-b472-4dc7b18630f7","Type":"ContainerStarted","Data":"de8363027a19a2575a1e5f21ab8f5925ccf6143c47109eddfa63b37259997d72"} Apr 16 13:18:18.236430 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.236377 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-5qzpc" podStartSLOduration=1.059427583 podStartE2EDuration="4.236359885s" podCreationTimestamp="2026-04-16 13:18:14 +0000 UTC" firstStartedPulling="2026-04-16 13:18:14.75424165 +0000 UTC m=+409.343618730" lastFinishedPulling="2026-04-16 13:18:17.931173967 +0000 UTC 
m=+412.520551032" observedRunningTime="2026-04-16 13:18:18.231058173 +0000 UTC m=+412.820435274" watchObservedRunningTime="2026-04-16 13:18:18.236359885 +0000 UTC m=+412.825736975" Apr 16 13:18:18.500813 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.500777 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7mwdl"] Apr 16 13:18:18.507319 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.507288 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" Apr 16 13:18:18.509543 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.509504 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 13:18:18.509680 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.509556 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-jw68q\"" Apr 16 13:18:18.510578 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.510552 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7mwdl"] Apr 16 13:18:18.560475 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.560438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw4t9\" (UniqueName: \"kubernetes.io/projected/d1922597-28ff-467e-96f9-a3557c298089-kube-api-access-nw4t9\") pod \"odh-model-controller-858dbf95b8-7mwdl\" (UID: \"d1922597-28ff-467e-96f9-a3557c298089\") " pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" Apr 16 13:18:18.560475 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.560479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1922597-28ff-467e-96f9-a3557c298089-cert\") pod \"odh-model-controller-858dbf95b8-7mwdl\" (UID: 
\"d1922597-28ff-467e-96f9-a3557c298089\") " pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" Apr 16 13:18:18.661872 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.661839 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1922597-28ff-467e-96f9-a3557c298089-cert\") pod \"odh-model-controller-858dbf95b8-7mwdl\" (UID: \"d1922597-28ff-467e-96f9-a3557c298089\") " pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" Apr 16 13:18:18.662038 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.661921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nw4t9\" (UniqueName: \"kubernetes.io/projected/d1922597-28ff-467e-96f9-a3557c298089-kube-api-access-nw4t9\") pod \"odh-model-controller-858dbf95b8-7mwdl\" (UID: \"d1922597-28ff-467e-96f9-a3557c298089\") " pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" Apr 16 13:18:18.662038 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:18:18.661979 2575 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 13:18:18.662140 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:18:18.662041 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1922597-28ff-467e-96f9-a3557c298089-cert podName:d1922597-28ff-467e-96f9-a3557c298089 nodeName:}" failed. No retries permitted until 2026-04-16 13:18:19.162024814 +0000 UTC m=+413.751401880 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d1922597-28ff-467e-96f9-a3557c298089-cert") pod "odh-model-controller-858dbf95b8-7mwdl" (UID: "d1922597-28ff-467e-96f9-a3557c298089") : secret "odh-model-controller-webhook-cert" not found Apr 16 13:18:18.669838 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:18.669807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw4t9\" (UniqueName: \"kubernetes.io/projected/d1922597-28ff-467e-96f9-a3557c298089-kube-api-access-nw4t9\") pod \"odh-model-controller-858dbf95b8-7mwdl\" (UID: \"d1922597-28ff-467e-96f9-a3557c298089\") " pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" Apr 16 13:18:19.165597 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:19.165560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1922597-28ff-467e-96f9-a3557c298089-cert\") pod \"odh-model-controller-858dbf95b8-7mwdl\" (UID: \"d1922597-28ff-467e-96f9-a3557c298089\") " pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" Apr 16 13:18:19.165781 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:18:19.165696 2575 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 13:18:19.165781 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:18:19.165753 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1922597-28ff-467e-96f9-a3557c298089-cert podName:d1922597-28ff-467e-96f9-a3557c298089 nodeName:}" failed. No retries permitted until 2026-04-16 13:18:20.165738294 +0000 UTC m=+414.755115360 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d1922597-28ff-467e-96f9-a3557c298089-cert") pod "odh-model-controller-858dbf95b8-7mwdl" (UID: "d1922597-28ff-467e-96f9-a3557c298089") : secret "odh-model-controller-webhook-cert" not found Apr 16 13:18:20.174326 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:20.174280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1922597-28ff-467e-96f9-a3557c298089-cert\") pod \"odh-model-controller-858dbf95b8-7mwdl\" (UID: \"d1922597-28ff-467e-96f9-a3557c298089\") " pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" Apr 16 13:18:20.176841 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:20.176813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1922597-28ff-467e-96f9-a3557c298089-cert\") pod \"odh-model-controller-858dbf95b8-7mwdl\" (UID: \"d1922597-28ff-467e-96f9-a3557c298089\") " pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" Apr 16 13:18:20.318566 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:20.318524 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" Apr 16 13:18:20.444561 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:20.444355 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7mwdl"] Apr 16 13:18:20.447006 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:18:20.446977 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1922597_28ff_467e_96f9_a3557c298089.slice/crio-7eaa31e51df3e867e125de04050c004d8f80131b14d0f9ffbabc44b332c7d0ec WatchSource:0}: Error finding container 7eaa31e51df3e867e125de04050c004d8f80131b14d0f9ffbabc44b332c7d0ec: Status 404 returned error can't find the container with id 7eaa31e51df3e867e125de04050c004d8f80131b14d0f9ffbabc44b332c7d0ec Apr 16 13:18:21.220112 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:21.220072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" event={"ID":"d1922597-28ff-467e-96f9-a3557c298089","Type":"ContainerStarted","Data":"7eaa31e51df3e867e125de04050c004d8f80131b14d0f9ffbabc44b332c7d0ec"} Apr 16 13:18:24.169878 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.169843 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-dmcnl"] Apr 16 13:18:24.171967 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.171946 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" Apr 16 13:18:24.173805 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.173785 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 13:18:24.173920 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.173831 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-jl7p9\"" Apr 16 13:18:24.180549 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.180499 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-dmcnl"] Apr 16 13:18:24.205648 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.205620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtf87\" (UniqueName: \"kubernetes.io/projected/4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b-kube-api-access-qtf87\") pod \"kserve-controller-manager-856948b99f-dmcnl\" (UID: \"4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b\") " pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" Apr 16 13:18:24.205793 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.205686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b-cert\") pod \"kserve-controller-manager-856948b99f-dmcnl\" (UID: \"4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b\") " pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" Apr 16 13:18:24.231135 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.231083 2575 generic.go:358] "Generic (PLEG): container finished" podID="d1922597-28ff-467e-96f9-a3557c298089" containerID="a245cb75b1493a628922bdf21c63b0505a968f269460deef750c3bfecb561b15" exitCode=1 Apr 16 13:18:24.231261 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.231169 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" event={"ID":"d1922597-28ff-467e-96f9-a3557c298089","Type":"ContainerDied","Data":"a245cb75b1493a628922bdf21c63b0505a968f269460deef750c3bfecb561b15"} Apr 16 13:18:24.231402 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.231389 2575 scope.go:117] "RemoveContainer" containerID="a245cb75b1493a628922bdf21c63b0505a968f269460deef750c3bfecb561b15" Apr 16 13:18:24.306514 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.306458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b-cert\") pod \"kserve-controller-manager-856948b99f-dmcnl\" (UID: \"4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b\") " pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" Apr 16 13:18:24.306709 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.306541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtf87\" (UniqueName: \"kubernetes.io/projected/4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b-kube-api-access-qtf87\") pod \"kserve-controller-manager-856948b99f-dmcnl\" (UID: \"4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b\") " pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" Apr 16 13:18:24.306887 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:18:24.306860 2575 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 13:18:24.306962 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:18:24.306945 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b-cert podName:4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b nodeName:}" failed. No retries permitted until 2026-04-16 13:18:24.806923882 +0000 UTC m=+419.396300952 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b-cert") pod "kserve-controller-manager-856948b99f-dmcnl" (UID: "4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b") : secret "kserve-webhook-server-cert" not found Apr 16 13:18:24.316171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.316143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtf87\" (UniqueName: \"kubernetes.io/projected/4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b-kube-api-access-qtf87\") pod \"kserve-controller-manager-856948b99f-dmcnl\" (UID: \"4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b\") " pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" Apr 16 13:18:24.810629 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.810542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b-cert\") pod \"kserve-controller-manager-856948b99f-dmcnl\" (UID: \"4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b\") " pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" Apr 16 13:18:24.812889 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:24.812869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b-cert\") pod \"kserve-controller-manager-856948b99f-dmcnl\" (UID: \"4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b\") " pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" Apr 16 13:18:25.083491 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:25.083382 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" Apr 16 13:18:25.203564 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:25.203534 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-dmcnl"] Apr 16 13:18:25.206710 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:18:25.206689 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a0e0fd6_c6eb_4ce0_85c4_d5d7c07cc54b.slice/crio-95660f9863367cc757f7bb88620c024d0bd77f941569e57c8831def8778fcfa4 WatchSource:0}: Error finding container 95660f9863367cc757f7bb88620c024d0bd77f941569e57c8831def8778fcfa4: Status 404 returned error can't find the container with id 95660f9863367cc757f7bb88620c024d0bd77f941569e57c8831def8778fcfa4 Apr 16 13:18:25.235380 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:25.235348 2575 generic.go:358] "Generic (PLEG): container finished" podID="d1922597-28ff-467e-96f9-a3557c298089" containerID="df00d51d9660915ded3c6bc49184fe20ebb6304aa579531b427604c065ff5392" exitCode=1 Apr 16 13:18:25.235533 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:25.235409 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" event={"ID":"d1922597-28ff-467e-96f9-a3557c298089","Type":"ContainerDied","Data":"df00d51d9660915ded3c6bc49184fe20ebb6304aa579531b427604c065ff5392"} Apr 16 13:18:25.235533 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:25.235437 2575 scope.go:117] "RemoveContainer" containerID="a245cb75b1493a628922bdf21c63b0505a968f269460deef750c3bfecb561b15" Apr 16 13:18:25.235689 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:25.235673 2575 scope.go:117] "RemoveContainer" containerID="df00d51d9660915ded3c6bc49184fe20ebb6304aa579531b427604c065ff5392" Apr 16 13:18:25.235919 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:18:25.235890 2575 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-7mwdl_opendatahub(d1922597-28ff-467e-96f9-a3557c298089)\"" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" podUID="d1922597-28ff-467e-96f9-a3557c298089" Apr 16 13:18:25.236621 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:25.236589 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" event={"ID":"4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b","Type":"ContainerStarted","Data":"95660f9863367cc757f7bb88620c024d0bd77f941569e57c8831def8778fcfa4"} Apr 16 13:18:26.241284 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:26.241258 2575 scope.go:117] "RemoveContainer" containerID="df00d51d9660915ded3c6bc49184fe20ebb6304aa579531b427604c065ff5392" Apr 16 13:18:26.241721 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:18:26.241557 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-7mwdl_opendatahub(d1922597-28ff-467e-96f9-a3557c298089)\"" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" podUID="d1922597-28ff-467e-96f9-a3557c298089" Apr 16 13:18:28.249940 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.249903 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" event={"ID":"4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b","Type":"ContainerStarted","Data":"015d8d011feb42f3616789ad9907a02caafeed5d6c93eceff3f91b6b1fe7710b"} Apr 16 13:18:28.250391 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.249967 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" Apr 16 13:18:28.275681 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.273329 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl" podStartSLOduration=1.5839456360000002 podStartE2EDuration="4.273302826s" podCreationTimestamp="2026-04-16 13:18:24 +0000 UTC" firstStartedPulling="2026-04-16 13:18:25.208036321 +0000 UTC m=+419.797413387" lastFinishedPulling="2026-04-16 13:18:27.897393507 +0000 UTC m=+422.486770577" observedRunningTime="2026-04-16 13:18:28.270515752 +0000 UTC m=+422.859892839" watchObservedRunningTime="2026-04-16 13:18:28.273302826 +0000 UTC m=+422.862679912" Apr 16 13:18:28.838064 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.838031 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nvswr"] Apr 16 13:18:28.840945 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.840926 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" Apr 16 13:18:28.842991 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.842969 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 13:18:28.843105 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.842969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-vl98w\"" Apr 16 13:18:28.843295 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.843281 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 13:18:28.854623 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.854595 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nvswr"] Apr 16 13:18:28.942668 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.942623 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/e0a7adad-2440-481a-b480-1a15c2ebc238-operator-config\") pod \"servicemesh-operator3-55f49c5f94-nvswr\" (UID: \"e0a7adad-2440-481a-b480-1a15c2ebc238\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" Apr 16 13:18:28.942848 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:28.942771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgz5\" (UniqueName: \"kubernetes.io/projected/e0a7adad-2440-481a-b480-1a15c2ebc238-kube-api-access-hmgz5\") pod \"servicemesh-operator3-55f49c5f94-nvswr\" (UID: \"e0a7adad-2440-481a-b480-1a15c2ebc238\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" Apr 16 13:18:29.043499 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:29.043456 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgz5\" (UniqueName: \"kubernetes.io/projected/e0a7adad-2440-481a-b480-1a15c2ebc238-kube-api-access-hmgz5\") pod \"servicemesh-operator3-55f49c5f94-nvswr\" (UID: \"e0a7adad-2440-481a-b480-1a15c2ebc238\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" Apr 16 13:18:29.043683 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:29.043510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/e0a7adad-2440-481a-b480-1a15c2ebc238-operator-config\") pod \"servicemesh-operator3-55f49c5f94-nvswr\" (UID: \"e0a7adad-2440-481a-b480-1a15c2ebc238\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" Apr 16 13:18:29.046026 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:29.046000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/e0a7adad-2440-481a-b480-1a15c2ebc238-operator-config\") pod 
\"servicemesh-operator3-55f49c5f94-nvswr\" (UID: \"e0a7adad-2440-481a-b480-1a15c2ebc238\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" Apr 16 13:18:29.053665 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:29.053608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmgz5\" (UniqueName: \"kubernetes.io/projected/e0a7adad-2440-481a-b480-1a15c2ebc238-kube-api-access-hmgz5\") pod \"servicemesh-operator3-55f49c5f94-nvswr\" (UID: \"e0a7adad-2440-481a-b480-1a15c2ebc238\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" Apr 16 13:18:29.150437 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:29.150349 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" Apr 16 13:18:29.278223 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:29.278190 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nvswr"] Apr 16 13:18:29.281556 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:18:29.281522 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0a7adad_2440_481a_b480_1a15c2ebc238.slice/crio-c88afea8784b2881cae55b9e1af2a06d17fdaae38d985f683b2840868135700c WatchSource:0}: Error finding container c88afea8784b2881cae55b9e1af2a06d17fdaae38d985f683b2840868135700c: Status 404 returned error can't find the container with id c88afea8784b2881cae55b9e1af2a06d17fdaae38d985f683b2840868135700c Apr 16 13:18:30.256888 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:30.256854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" event={"ID":"e0a7adad-2440-481a-b480-1a15c2ebc238","Type":"ContainerStarted","Data":"c88afea8784b2881cae55b9e1af2a06d17fdaae38d985f683b2840868135700c"} Apr 16 13:18:30.319014 ip-10-0-142-166 kubenswrapper[2575]: 
I0416 13:18:30.318979 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl"
Apr 16 13:18:30.319390 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:30.319381 2575 scope.go:117] "RemoveContainer" containerID="df00d51d9660915ded3c6bc49184fe20ebb6304aa579531b427604c065ff5392"
Apr 16 13:18:30.319568 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:18:30.319550 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-7mwdl_opendatahub(d1922597-28ff-467e-96f9-a3557c298089)\"" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" podUID="d1922597-28ff-467e-96f9-a3557c298089"
Apr 16 13:18:33.268065 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:33.268035 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" event={"ID":"e0a7adad-2440-481a-b480-1a15c2ebc238","Type":"ContainerStarted","Data":"9b38de812f2b9c1b21442e9ef67782ebbbec109bf505211187683d67290b556b"}
Apr 16 13:18:33.268530 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:33.268184 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr"
Apr 16 13:18:33.287175 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:33.287109 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr" podStartSLOduration=1.7493186170000001 podStartE2EDuration="5.287094904s" podCreationTimestamp="2026-04-16 13:18:28 +0000 UTC" firstStartedPulling="2026-04-16 13:18:29.284147275 +0000 UTC m=+423.873524341" lastFinishedPulling="2026-04-16 13:18:32.821923563 +0000 UTC m=+427.411300628" observedRunningTime="2026-04-16 13:18:33.285816624 +0000 UTC m=+427.875193722" watchObservedRunningTime="2026-04-16 13:18:33.287094904 +0000 UTC m=+427.876471991"
Apr 16 13:18:34.023505 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.023469 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"]
Apr 16 13:18:34.026563 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.026548 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.028416 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.028389 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 13:18:34.028416 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.028405 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 16 13:18:34.028621 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.028392 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 16 13:18:34.028692 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.028678 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 16 13:18:34.028812 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.028793 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-9vtnh\""
Apr 16 13:18:34.038262 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.038240 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"]
Apr 16 13:18:34.089439 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.089408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/57308b09-bbb5-4479-bffd-7c974dfe32af-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.089587 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.089443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.089587 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.089494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/57308b09-bbb5-4479-bffd-7c974dfe32af-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.089587 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.089530 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.089696 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.089617 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.089696 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.089664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.089696 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.089683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwxkp\" (UniqueName: \"kubernetes.io/projected/57308b09-bbb5-4479-bffd-7c974dfe32af-kube-api-access-rwxkp\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.190114 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.190078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.190312 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.190162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.190312 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.190204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.190312 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.190229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwxkp\" (UniqueName: \"kubernetes.io/projected/57308b09-bbb5-4479-bffd-7c974dfe32af-kube-api-access-rwxkp\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.190312 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.190255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/57308b09-bbb5-4479-bffd-7c974dfe32af-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.190312 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.190280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.190558 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.190433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/57308b09-bbb5-4479-bffd-7c974dfe32af-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.190799 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.190771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.193013 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.192968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.193130 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.193072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.193484 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.193465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/57308b09-bbb5-4479-bffd-7c974dfe32af-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.193608 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.193591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/57308b09-bbb5-4479-bffd-7c974dfe32af-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.203273 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.203207 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/57308b09-bbb5-4479-bffd-7c974dfe32af-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.203273 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.203221 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwxkp\" (UniqueName: \"kubernetes.io/projected/57308b09-bbb5-4479-bffd-7c974dfe32af-kube-api-access-rwxkp\") pod \"istiod-openshift-gateway-55ff986f96-g986f\" (UID: \"57308b09-bbb5-4479-bffd-7c974dfe32af\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.335995 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.335897 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:34.456731 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:34.456707 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"]
Apr 16 13:18:34.459110 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:18:34.459078 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57308b09_bbb5_4479_bffd_7c974dfe32af.slice/crio-cdd9e8e72d84ce61a869188425ada8544d780267cb55326d6caf1bcc30319e79 WatchSource:0}: Error finding container cdd9e8e72d84ce61a869188425ada8544d780267cb55326d6caf1bcc30319e79: Status 404 returned error can't find the container with id cdd9e8e72d84ce61a869188425ada8544d780267cb55326d6caf1bcc30319e79
Apr 16 13:18:35.276825 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:35.276792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f" event={"ID":"57308b09-bbb5-4479-bffd-7c974dfe32af","Type":"ContainerStarted","Data":"cdd9e8e72d84ce61a869188425ada8544d780267cb55326d6caf1bcc30319e79"}
Apr 16 13:18:36.820738 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:36.820697 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 16 13:18:36.821185 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:36.820766 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 16 13:18:37.287525 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:37.287490 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f" event={"ID":"57308b09-bbb5-4479-bffd-7c974dfe32af","Type":"ContainerStarted","Data":"b1624e51abed94b43b5ac9d9842a68b1861860f4c62195c81b66000ac7f3a8f6"}
Apr 16 13:18:37.287728 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:37.287696 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:37.289395 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:37.289373 2575 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-g986f container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 16 13:18:37.289501 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:37.289418 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f" podUID="57308b09-bbb5-4479-bffd-7c974dfe32af" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 13:18:37.305805 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:37.305763 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f" podStartSLOduration=0.946851526 podStartE2EDuration="3.305748655s" podCreationTimestamp="2026-04-16 13:18:34 +0000 UTC" firstStartedPulling="2026-04-16 13:18:34.461578284 +0000 UTC m=+429.050955353" lastFinishedPulling="2026-04-16 13:18:36.820475403 +0000 UTC m=+431.409852482" observedRunningTime="2026-04-16 13:18:37.304549308 +0000 UTC m=+431.893926415" watchObservedRunningTime="2026-04-16 13:18:37.305748655 +0000 UTC m=+431.895125742"
Apr 16 13:18:38.291557 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:38.291486 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-g986f"
Apr 16 13:18:40.318756 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:40.318721 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl"
Apr 16 13:18:40.319178 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:40.319162 2575 scope.go:117] "RemoveContainer" containerID="df00d51d9660915ded3c6bc49184fe20ebb6304aa579531b427604c065ff5392"
Apr 16 13:18:41.302504 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:41.302462 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" event={"ID":"d1922597-28ff-467e-96f9-a3557c298089","Type":"ContainerStarted","Data":"049ae8dd9f5368cd0d274f51516c158bb8a24493f5928c8ed1c3baf2a15bb901"}
Apr 16 13:18:41.302687 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:41.302669 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl"
Apr 16 13:18:41.318683 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:41.318637 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl" podStartSLOduration=3.172908696 podStartE2EDuration="23.318622897s" podCreationTimestamp="2026-04-16 13:18:18 +0000 UTC" firstStartedPulling="2026-04-16 13:18:20.448540721 +0000 UTC m=+415.037917802" lastFinishedPulling="2026-04-16 13:18:40.594254937 +0000 UTC m=+435.183632003" observedRunningTime="2026-04-16 13:18:41.316844563 +0000 UTC m=+435.906221660" watchObservedRunningTime="2026-04-16 13:18:41.318622897 +0000 UTC m=+435.907999984"
Apr 16 13:18:44.274924 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:44.274888 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nvswr"
Apr 16 13:18:52.307527 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:52.307490 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-7mwdl"
Apr 16 13:18:59.258378 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:18:59.258348 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-dmcnl"
Apr 16 13:19:50.832707 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:50.832665 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4hhfl"]
Apr 16 13:19:50.834883 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:50.834860 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-4hhfl"
Apr 16 13:19:50.836942 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:50.836923 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 13:19:50.837579 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:50.837564 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 13:19:50.837782 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:50.837768 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-q5l9f\""
Apr 16 13:19:50.852278 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:50.852259 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4hhfl"]
Apr 16 13:19:50.870807 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:50.870779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml8xh\" (UniqueName: \"kubernetes.io/projected/1f1a3347-d911-4625-a33f-801bd0705e43-kube-api-access-ml8xh\") pod \"authorino-operator-657f44b778-4hhfl\" (UID: \"1f1a3347-d911-4625-a33f-801bd0705e43\") " pod="kuadrant-system/authorino-operator-657f44b778-4hhfl"
Apr 16 13:19:50.972115 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:50.972077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml8xh\" (UniqueName: \"kubernetes.io/projected/1f1a3347-d911-4625-a33f-801bd0705e43-kube-api-access-ml8xh\") pod \"authorino-operator-657f44b778-4hhfl\" (UID: \"1f1a3347-d911-4625-a33f-801bd0705e43\") " pod="kuadrant-system/authorino-operator-657f44b778-4hhfl"
Apr 16 13:19:50.983836 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:50.983812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml8xh\" (UniqueName: \"kubernetes.io/projected/1f1a3347-d911-4625-a33f-801bd0705e43-kube-api-access-ml8xh\") pod \"authorino-operator-657f44b778-4hhfl\" (UID: \"1f1a3347-d911-4625-a33f-801bd0705e43\") " pod="kuadrant-system/authorino-operator-657f44b778-4hhfl"
Apr 16 13:19:51.144657 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:51.144555 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-4hhfl"
Apr 16 13:19:51.275414 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:51.275358 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4hhfl"]
Apr 16 13:19:51.279216 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:19:51.279185 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f1a3347_d911_4625_a33f_801bd0705e43.slice/crio-c8c44f7c42de07ce828ba03a5db55fdcb3e16a3fc61e4ba414ccc17b13401f71 WatchSource:0}: Error finding container c8c44f7c42de07ce828ba03a5db55fdcb3e16a3fc61e4ba414ccc17b13401f71: Status 404 returned error can't find the container with id c8c44f7c42de07ce828ba03a5db55fdcb3e16a3fc61e4ba414ccc17b13401f71
Apr 16 13:19:51.524972 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:51.524936 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-4hhfl" event={"ID":"1f1a3347-d911-4625-a33f-801bd0705e43","Type":"ContainerStarted","Data":"c8c44f7c42de07ce828ba03a5db55fdcb3e16a3fc61e4ba414ccc17b13401f71"}
Apr 16 13:19:53.532764 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:53.532717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-4hhfl" event={"ID":"1f1a3347-d911-4625-a33f-801bd0705e43","Type":"ContainerStarted","Data":"55d5946db2b98b8528ae3bfbce4e44ef33988c3b5735fb182ca337685471ce5c"}
Apr 16 13:19:53.533235 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:53.532833 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-4hhfl"
Apr 16 13:19:53.549781 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:19:53.549731 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-4hhfl" podStartSLOduration=2.108381137 podStartE2EDuration="3.549717988s" podCreationTimestamp="2026-04-16 13:19:50 +0000 UTC" firstStartedPulling="2026-04-16 13:19:51.281076756 +0000 UTC m=+505.870453822" lastFinishedPulling="2026-04-16 13:19:52.722413604 +0000 UTC m=+507.311790673" observedRunningTime="2026-04-16 13:19:53.548365793 +0000 UTC m=+508.137742880" watchObservedRunningTime="2026-04-16 13:19:53.549717988 +0000 UTC m=+508.139095103"
Apr 16 13:20:04.538570 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:04.538538 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-4hhfl"
Apr 16 13:20:40.315234 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.315196 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-x6r5v"]
Apr 16 13:20:40.317302 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.317286 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:40.319234 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.319212 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-b5hqc\""
Apr 16 13:20:40.319349 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.319213 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 13:20:40.326787 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.326763 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-x6r5v"]
Apr 16 13:20:40.413059 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.413026 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-x6r5v"]
Apr 16 13:20:40.458560 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.458527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p7wg\" (UniqueName: \"kubernetes.io/projected/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-kube-api-access-4p7wg\") pod \"limitador-limitador-7d549b5b-x6r5v\" (UID: \"91a076cd-10cf-42ef-ad5b-7a4df90ed95d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:40.458729 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.458568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-config-file\") pod \"limitador-limitador-7d549b5b-x6r5v\" (UID: \"91a076cd-10cf-42ef-ad5b-7a4df90ed95d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:40.559608 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.559575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p7wg\" (UniqueName: \"kubernetes.io/projected/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-kube-api-access-4p7wg\") pod \"limitador-limitador-7d549b5b-x6r5v\" (UID: \"91a076cd-10cf-42ef-ad5b-7a4df90ed95d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:40.559799 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.559619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-config-file\") pod \"limitador-limitador-7d549b5b-x6r5v\" (UID: \"91a076cd-10cf-42ef-ad5b-7a4df90ed95d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:40.560267 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.560247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-config-file\") pod \"limitador-limitador-7d549b5b-x6r5v\" (UID: \"91a076cd-10cf-42ef-ad5b-7a4df90ed95d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:40.566925 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.566842 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p7wg\" (UniqueName: \"kubernetes.io/projected/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-kube-api-access-4p7wg\") pod \"limitador-limitador-7d549b5b-x6r5v\" (UID: \"91a076cd-10cf-42ef-ad5b-7a4df90ed95d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:40.627849 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.627813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:40.752670 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:40.752640 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-x6r5v"]
Apr 16 13:20:40.756197 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:20:40.756160 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91a076cd_10cf_42ef_ad5b_7a4df90ed95d.slice/crio-6fd84e9a9c7158b82e426d29b917c151e72cae89fb895f8b3493234ca22581a5 WatchSource:0}: Error finding container 6fd84e9a9c7158b82e426d29b917c151e72cae89fb895f8b3493234ca22581a5: Status 404 returned error can't find the container with id 6fd84e9a9c7158b82e426d29b917c151e72cae89fb895f8b3493234ca22581a5
Apr 16 13:20:41.688022 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:41.687980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v" event={"ID":"91a076cd-10cf-42ef-ad5b-7a4df90ed95d","Type":"ContainerStarted","Data":"6fd84e9a9c7158b82e426d29b917c151e72cae89fb895f8b3493234ca22581a5"}
Apr 16 13:20:43.695966 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:43.695925 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v" event={"ID":"91a076cd-10cf-42ef-ad5b-7a4df90ed95d","Type":"ContainerStarted","Data":"2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd"}
Apr 16 13:20:43.696458 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:43.696028 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:43.712751 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:43.712696 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v" podStartSLOduration=0.996237403 podStartE2EDuration="3.712679466s" podCreationTimestamp="2026-04-16 13:20:40 +0000 UTC" firstStartedPulling="2026-04-16 13:20:40.757870758 +0000 UTC m=+555.347247824" lastFinishedPulling="2026-04-16 13:20:43.474312807 +0000 UTC m=+558.063689887" observedRunningTime="2026-04-16 13:20:43.710439423 +0000 UTC m=+558.299816527" watchObservedRunningTime="2026-04-16 13:20:43.712679466 +0000 UTC m=+558.302056600"
Apr 16 13:20:54.700193 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:54.700157 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:55.595716 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:55.595676 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-x6r5v"]
Apr 16 13:20:55.595992 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:55.595944 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v" podUID="91a076cd-10cf-42ef-ad5b-7a4df90ed95d" containerName="limitador" containerID="cri-o://2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd" gracePeriod=30
Apr 16 13:20:56.537883 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.537862 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:56.581681 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.581655 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-config-file\") pod \"91a076cd-10cf-42ef-ad5b-7a4df90ed95d\" (UID: \"91a076cd-10cf-42ef-ad5b-7a4df90ed95d\") "
Apr 16 13:20:56.581829 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.581691 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p7wg\" (UniqueName: \"kubernetes.io/projected/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-kube-api-access-4p7wg\") pod \"91a076cd-10cf-42ef-ad5b-7a4df90ed95d\" (UID: \"91a076cd-10cf-42ef-ad5b-7a4df90ed95d\") "
Apr 16 13:20:56.582025 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.582004 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-config-file" (OuterVolumeSpecName: "config-file") pod "91a076cd-10cf-42ef-ad5b-7a4df90ed95d" (UID: "91a076cd-10cf-42ef-ad5b-7a4df90ed95d"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:20:56.583891 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.583868 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-kube-api-access-4p7wg" (OuterVolumeSpecName: "kube-api-access-4p7wg") pod "91a076cd-10cf-42ef-ad5b-7a4df90ed95d" (UID: "91a076cd-10cf-42ef-ad5b-7a4df90ed95d"). InnerVolumeSpecName "kube-api-access-4p7wg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:20:56.683192 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.683092 2575 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-config-file\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\""
Apr 16 13:20:56.683192 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.683153 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4p7wg\" (UniqueName: \"kubernetes.io/projected/91a076cd-10cf-42ef-ad5b-7a4df90ed95d-kube-api-access-4p7wg\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\""
Apr 16 13:20:56.736363 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.736331 2575 generic.go:358] "Generic (PLEG): container finished" podID="91a076cd-10cf-42ef-ad5b-7a4df90ed95d" containerID="2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd" exitCode=0
Apr 16 13:20:56.736530 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.736405 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v"
Apr 16 13:20:56.736530 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.736418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v" event={"ID":"91a076cd-10cf-42ef-ad5b-7a4df90ed95d","Type":"ContainerDied","Data":"2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd"}
Apr 16 13:20:56.736530 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.736462 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-x6r5v" event={"ID":"91a076cd-10cf-42ef-ad5b-7a4df90ed95d","Type":"ContainerDied","Data":"6fd84e9a9c7158b82e426d29b917c151e72cae89fb895f8b3493234ca22581a5"}
Apr 16 13:20:56.736530 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.736479 2575 scope.go:117] "RemoveContainer" containerID="2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd"
Apr 16 13:20:56.744976 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.744959 2575 scope.go:117] "RemoveContainer" containerID="2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd"
Apr 16 13:20:56.745282 ip-10-0-142-166 kubenswrapper[2575]: E0416 13:20:56.745264 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd\": container with ID starting with 2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd not found: ID does not exist" containerID="2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd"
Apr 16 13:20:56.745337 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.745291 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd"} err="failed to get container status \"2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd\": rpc error: code = NotFound desc = could not find container \"2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd\": container with ID starting with 2decbfd8fb9bcf407ba1728ca340aceb017efbd4eaabe7ddfa18e3c570478edd not found: ID does not exist"
Apr 16 13:20:56.756864 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.756835 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-x6r5v"]
Apr 16 13:20:56.760684 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:56.760661 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-x6r5v"]
Apr 16 13:20:58.046852 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:20:58.046818 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a076cd-10cf-42ef-ad5b-7a4df90ed95d" path="/var/lib/kubelet/pods/91a076cd-10cf-42ef-ad5b-7a4df90ed95d/volumes"
Apr 16 13:21:01.541940 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.541904 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-st9jq"]
Apr 16 13:21:01.542331 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.542190 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91a076cd-10cf-42ef-ad5b-7a4df90ed95d" containerName="limitador"
Apr 16 13:21:01.542331 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.542201 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a076cd-10cf-42ef-ad5b-7a4df90ed95d" containerName="limitador"
Apr 16 13:21:01.542331 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.542253 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="91a076cd-10cf-42ef-ad5b-7a4df90ed95d" containerName="limitador"
Apr 16 13:21:01.545163 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.545141 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-st9jq"
Apr 16 13:21:01.547237 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.547207 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-z92px\""
Apr 16 13:21:01.547363 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.547208 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 16 13:21:01.552044 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.552015 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-st9jq"]
Apr 16 13:21:01.618185 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.618148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x45zn\" (UniqueName: \"kubernetes.io/projected/ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3-kube-api-access-x45zn\") pod \"postgres-868db5846d-st9jq\" (UID: \"ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3\") " pod="opendatahub/postgres-868db5846d-st9jq"
Apr 16 13:21:01.618340 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.618197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3-data\") pod \"postgres-868db5846d-st9jq\" (UID: \"ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3\") " pod="opendatahub/postgres-868db5846d-st9jq"
Apr 16 13:21:01.719365 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.719330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3-data\") pod \"postgres-868db5846d-st9jq\" (UID: \"ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3\") " pod="opendatahub/postgres-868db5846d-st9jq"
Apr 16 13:21:01.719562 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.719430 2575 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kube-api-access-x45zn\" (UniqueName: \"kubernetes.io/projected/ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3-kube-api-access-x45zn\") pod \"postgres-868db5846d-st9jq\" (UID: \"ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3\") " pod="opendatahub/postgres-868db5846d-st9jq" Apr 16 13:21:01.719780 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.719759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3-data\") pod \"postgres-868db5846d-st9jq\" (UID: \"ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3\") " pod="opendatahub/postgres-868db5846d-st9jq" Apr 16 13:21:01.730280 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.730247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x45zn\" (UniqueName: \"kubernetes.io/projected/ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3-kube-api-access-x45zn\") pod \"postgres-868db5846d-st9jq\" (UID: \"ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3\") " pod="opendatahub/postgres-868db5846d-st9jq" Apr 16 13:21:01.857938 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.857805 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-st9jq" Apr 16 13:21:01.975512 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:01.975487 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-st9jq"] Apr 16 13:21:01.977795 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:21:01.977763 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea62fa2b_68c7_4401_8b30_b85e7ebaa0c3.slice/crio-927f82602ab6742ec27277b5aa1cf2ca392195b3c27561c5f81e15cf672edee3 WatchSource:0}: Error finding container 927f82602ab6742ec27277b5aa1cf2ca392195b3c27561c5f81e15cf672edee3: Status 404 returned error can't find the container with id 927f82602ab6742ec27277b5aa1cf2ca392195b3c27561c5f81e15cf672edee3 Apr 16 13:21:02.756376 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:02.756341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-st9jq" event={"ID":"ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3","Type":"ContainerStarted","Data":"927f82602ab6742ec27277b5aa1cf2ca392195b3c27561c5f81e15cf672edee3"} Apr 16 13:21:07.776799 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:07.776767 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-st9jq" event={"ID":"ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3","Type":"ContainerStarted","Data":"a2d0a29e86fb6799c5c0c5ba3d03aa2b440a04a72f41dc4f61c649bce4241be6"} Apr 16 13:21:07.777223 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:07.776902 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-st9jq" Apr 16 13:21:07.792601 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:07.792544 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-st9jq" podStartSLOduration=1.871357003 podStartE2EDuration="6.792529336s" podCreationTimestamp="2026-04-16 13:21:01 +0000 UTC" 
firstStartedPulling="2026-04-16 13:21:01.979195127 +0000 UTC m=+576.568572206" lastFinishedPulling="2026-04-16 13:21:06.900367469 +0000 UTC m=+581.489744539" observedRunningTime="2026-04-16 13:21:07.791065635 +0000 UTC m=+582.380442745" watchObservedRunningTime="2026-04-16 13:21:07.792529336 +0000 UTC m=+582.381906424" Apr 16 13:21:13.807778 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:13.807751 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-st9jq" Apr 16 13:21:22.779325 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:22.779285 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-8r2s6"] Apr 16 13:21:22.806997 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:22.806964 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-8r2s6"] Apr 16 13:21:22.807166 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:22.807098 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-8r2s6" Apr 16 13:21:22.809416 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:22.809396 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 16 13:21:22.810137 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:22.810064 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 16 13:21:22.810137 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:22.810073 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-qblkf\"" Apr 16 13:21:22.900911 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:22.900874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qs7\" (UniqueName: \"kubernetes.io/projected/925e8436-a8f6-4a40-bef9-19161d9bfbe9-kube-api-access-m5qs7\") pod \"keycloak-operator-5c4df598dd-8r2s6\" (UID: \"925e8436-a8f6-4a40-bef9-19161d9bfbe9\") " pod="keycloak-system/keycloak-operator-5c4df598dd-8r2s6" Apr 16 13:21:23.001673 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:23.001639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qs7\" (UniqueName: \"kubernetes.io/projected/925e8436-a8f6-4a40-bef9-19161d9bfbe9-kube-api-access-m5qs7\") pod \"keycloak-operator-5c4df598dd-8r2s6\" (UID: \"925e8436-a8f6-4a40-bef9-19161d9bfbe9\") " pod="keycloak-system/keycloak-operator-5c4df598dd-8r2s6" Apr 16 13:21:23.009714 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:23.009690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qs7\" (UniqueName: \"kubernetes.io/projected/925e8436-a8f6-4a40-bef9-19161d9bfbe9-kube-api-access-m5qs7\") pod \"keycloak-operator-5c4df598dd-8r2s6\" (UID: \"925e8436-a8f6-4a40-bef9-19161d9bfbe9\") " 
pod="keycloak-system/keycloak-operator-5c4df598dd-8r2s6" Apr 16 13:21:23.117706 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:23.117616 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-8r2s6" Apr 16 13:21:23.236006 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:23.235968 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-8r2s6"] Apr 16 13:21:23.238325 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:21:23.238298 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod925e8436_a8f6_4a40_bef9_19161d9bfbe9.slice/crio-f3a2431d7f5330cabd871cb1822673b7afd3c24f9ce46fda56c5f6491f8b0fe8 WatchSource:0}: Error finding container f3a2431d7f5330cabd871cb1822673b7afd3c24f9ce46fda56c5f6491f8b0fe8: Status 404 returned error can't find the container with id f3a2431d7f5330cabd871cb1822673b7afd3c24f9ce46fda56c5f6491f8b0fe8 Apr 16 13:21:23.829621 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:23.829565 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-8r2s6" event={"ID":"925e8436-a8f6-4a40-bef9-19161d9bfbe9","Type":"ContainerStarted","Data":"f3a2431d7f5330cabd871cb1822673b7afd3c24f9ce46fda56c5f6491f8b0fe8"} Apr 16 13:21:25.947334 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:25.947304 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:21:25.947721 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:25.947571 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:21:28.847175 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:28.847145 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-8r2s6" event={"ID":"925e8436-a8f6-4a40-bef9-19161d9bfbe9","Type":"ContainerStarted","Data":"c53435a5886757bd42269035c62895e9e3a55a339b521ee76fd5b1688e1863e9"} Apr 16 13:21:29.866673 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:21:29.866617 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-8r2s6" podStartSLOduration=2.338103335 podStartE2EDuration="7.866598567s" podCreationTimestamp="2026-04-16 13:21:22 +0000 UTC" firstStartedPulling="2026-04-16 13:21:23.239694759 +0000 UTC m=+597.829071828" lastFinishedPulling="2026-04-16 13:21:28.76818998 +0000 UTC m=+603.357567060" observedRunningTime="2026-04-16 13:21:29.865541023 +0000 UTC m=+604.454918111" watchObservedRunningTime="2026-04-16 13:21:29.866598567 +0000 UTC m=+604.455975657" Apr 16 13:22:08.596387 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.596337 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7565bd457c-8x8ds"] Apr 16 13:22:08.599788 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.599766 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7565bd457c-8x8ds" Apr 16 13:22:08.602039 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.602012 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 16 13:22:08.602171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.602014 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 16 13:22:08.602171 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.602016 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-dkfvz\"" Apr 16 13:22:08.605806 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.605782 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7565bd457c-8x8ds"] Apr 16 13:22:08.676225 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.676195 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/75188ba9-7fa3-4368-b24a-d60610f84e4e-maas-api-tls\") pod \"maas-api-7565bd457c-8x8ds\" (UID: \"75188ba9-7fa3-4368-b24a-d60610f84e4e\") " pod="opendatahub/maas-api-7565bd457c-8x8ds" Apr 16 13:22:08.676392 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.676234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c6jj\" (UniqueName: \"kubernetes.io/projected/75188ba9-7fa3-4368-b24a-d60610f84e4e-kube-api-access-7c6jj\") pod \"maas-api-7565bd457c-8x8ds\" (UID: \"75188ba9-7fa3-4368-b24a-d60610f84e4e\") " pod="opendatahub/maas-api-7565bd457c-8x8ds" Apr 16 13:22:08.777641 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.777608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/75188ba9-7fa3-4368-b24a-d60610f84e4e-maas-api-tls\") pod 
\"maas-api-7565bd457c-8x8ds\" (UID: \"75188ba9-7fa3-4368-b24a-d60610f84e4e\") " pod="opendatahub/maas-api-7565bd457c-8x8ds" Apr 16 13:22:08.777797 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.777668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c6jj\" (UniqueName: \"kubernetes.io/projected/75188ba9-7fa3-4368-b24a-d60610f84e4e-kube-api-access-7c6jj\") pod \"maas-api-7565bd457c-8x8ds\" (UID: \"75188ba9-7fa3-4368-b24a-d60610f84e4e\") " pod="opendatahub/maas-api-7565bd457c-8x8ds" Apr 16 13:22:08.780073 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.780043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/75188ba9-7fa3-4368-b24a-d60610f84e4e-maas-api-tls\") pod \"maas-api-7565bd457c-8x8ds\" (UID: \"75188ba9-7fa3-4368-b24a-d60610f84e4e\") " pod="opendatahub/maas-api-7565bd457c-8x8ds" Apr 16 13:22:08.785158 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.785132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c6jj\" (UniqueName: \"kubernetes.io/projected/75188ba9-7fa3-4368-b24a-d60610f84e4e-kube-api-access-7c6jj\") pod \"maas-api-7565bd457c-8x8ds\" (UID: \"75188ba9-7fa3-4368-b24a-d60610f84e4e\") " pod="opendatahub/maas-api-7565bd457c-8x8ds" Apr 16 13:22:08.911729 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:08.911643 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7565bd457c-8x8ds" Apr 16 13:22:09.032234 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:09.032209 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7565bd457c-8x8ds"] Apr 16 13:22:09.034644 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:22:09.034616 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75188ba9_7fa3_4368_b24a_d60610f84e4e.slice/crio-4003e7bc099c42164de1f20730b0b19ce17fef204167c5b94d0478c02a9e0917 WatchSource:0}: Error finding container 4003e7bc099c42164de1f20730b0b19ce17fef204167c5b94d0478c02a9e0917: Status 404 returned error can't find the container with id 4003e7bc099c42164de1f20730b0b19ce17fef204167c5b94d0478c02a9e0917 Apr 16 13:22:09.986326 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:09.986292 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7565bd457c-8x8ds" event={"ID":"75188ba9-7fa3-4368-b24a-d60610f84e4e","Type":"ContainerStarted","Data":"4003e7bc099c42164de1f20730b0b19ce17fef204167c5b94d0478c02a9e0917"} Apr 16 13:22:11.994535 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:11.994497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7565bd457c-8x8ds" event={"ID":"75188ba9-7fa3-4368-b24a-d60610f84e4e","Type":"ContainerStarted","Data":"380c71d167ae0702b001c5de1bc5fe9f6cc1153665cd3f145fa3af6a0827f9d0"} Apr 16 13:22:11.994928 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:11.994615 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7565bd457c-8x8ds" Apr 16 13:22:12.014495 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:12.014443 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7565bd457c-8x8ds" podStartSLOduration=2.001757465 podStartE2EDuration="4.014427895s" podCreationTimestamp="2026-04-16 13:22:08 +0000 UTC" 
firstStartedPulling="2026-04-16 13:22:09.035951805 +0000 UTC m=+643.625328871" lastFinishedPulling="2026-04-16 13:22:11.04862222 +0000 UTC m=+645.637999301" observedRunningTime="2026-04-16 13:22:12.012333238 +0000 UTC m=+646.601710346" watchObservedRunningTime="2026-04-16 13:22:12.014427895 +0000 UTC m=+646.603804982" Apr 16 13:22:18.003791 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:22:18.003764 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7565bd457c-8x8ds" Apr 16 13:23:39.374529 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.374492 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc"] Apr 16 13:23:39.377860 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.377838 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.380306 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.380279 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 13:23:39.380306 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.380301 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 13:23:39.380454 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.380400 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 16 13:23:39.380502 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.380447 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-br65d\"" Apr 16 13:23:39.385463 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.385422 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc"] Apr 16 13:23:39.492729 ip-10-0-142-166 
kubenswrapper[2575]: I0416 13:23:39.492688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.492729 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.492730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.492951 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.492753 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hkhn\" (UniqueName: \"kubernetes.io/projected/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-kube-api-access-7hkhn\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.492951 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.492784 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.492951 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.492800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.492951 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.492819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.593702 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.593659 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.593702 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.593701 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.593950 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.593803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hkhn\" (UniqueName: \"kubernetes.io/projected/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-kube-api-access-7hkhn\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " 
pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.593950 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.593824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.593950 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.593840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.593950 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.593860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.594267 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.594242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.594352 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.594309 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.594352 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.594343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.596083 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.596064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.596418 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.596397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.611880 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.611844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hkhn\" (UniqueName: \"kubernetes.io/projected/3e9d7a72-97a6-44c1-98b6-44b534c54d6a-kube-api-access-7hkhn\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-66zqc\" (UID: \"3e9d7a72-97a6-44c1-98b6-44b534c54d6a\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 
13:23:39.688912 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.688824 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:39.814330 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.814303 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc"] Apr 16 13:23:39.816868 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:23:39.816830 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e9d7a72_97a6_44c1_98b6_44b534c54d6a.slice/crio-6c2bb61ec46a20d10a121d475f29f3b426345bccf02da3596de1cb45d2663ed3 WatchSource:0}: Error finding container 6c2bb61ec46a20d10a121d475f29f3b426345bccf02da3596de1cb45d2663ed3: Status 404 returned error can't find the container with id 6c2bb61ec46a20d10a121d475f29f3b426345bccf02da3596de1cb45d2663ed3 Apr 16 13:23:39.818560 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:39.818543 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:23:40.270346 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:40.270310 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" event={"ID":"3e9d7a72-97a6-44c1-98b6-44b534c54d6a","Type":"ContainerStarted","Data":"6c2bb61ec46a20d10a121d475f29f3b426345bccf02da3596de1cb45d2663ed3"} Apr 16 13:23:45.291141 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:45.291087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" event={"ID":"3e9d7a72-97a6-44c1-98b6-44b534c54d6a","Type":"ContainerStarted","Data":"f275b890118a9f02cb366f201f3701817d1b85e87be5e5569462952aa0c235d6"} Apr 16 13:23:51.312216 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:51.312178 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="3e9d7a72-97a6-44c1-98b6-44b534c54d6a" containerID="f275b890118a9f02cb366f201f3701817d1b85e87be5e5569462952aa0c235d6" exitCode=0 Apr 16 13:23:51.312585 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:51.312253 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" event={"ID":"3e9d7a72-97a6-44c1-98b6-44b534c54d6a","Type":"ContainerDied","Data":"f275b890118a9f02cb366f201f3701817d1b85e87be5e5569462952aa0c235d6"} Apr 16 13:23:55.327574 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:55.327532 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" event={"ID":"3e9d7a72-97a6-44c1-98b6-44b534c54d6a","Type":"ContainerStarted","Data":"00cfa51e16c8510915bb23deebde042717f9b1aea48b2f7c7f5817f7bf32cc7e"} Apr 16 13:23:55.328027 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:55.327779 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:23:55.345250 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:23:55.345197 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" podStartSLOduration=0.963440267 podStartE2EDuration="16.345181698s" podCreationTimestamp="2026-04-16 13:23:39 +0000 UTC" firstStartedPulling="2026-04-16 13:23:39.818686552 +0000 UTC m=+734.408063618" lastFinishedPulling="2026-04-16 13:23:55.200427983 +0000 UTC m=+749.789805049" observedRunningTime="2026-04-16 13:23:55.342974943 +0000 UTC m=+749.932352056" watchObservedRunningTime="2026-04-16 13:23:55.345181698 +0000 UTC m=+749.934558786" Apr 16 13:24:06.352157 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:06.352105 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-66zqc" Apr 16 13:24:37.998249 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:37.998210 
2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8"] Apr 16 13:24:38.001562 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.001528 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.003758 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.003729 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 16 13:24:38.008838 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.008811 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8"] Apr 16 13:24:38.095607 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.095568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm7vq\" (UniqueName: \"kubernetes.io/projected/aca8b838-452e-4414-9a92-f3ffef31724a-kube-api-access-vm7vq\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.095607 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.095607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.095823 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.095645 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/aca8b838-452e-4414-9a92-f3ffef31724a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.095823 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.095681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.095823 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.095745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.095823 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.095816 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.197061 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.196978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-kserve-provision-location\") pod 
\"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.197061 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.197056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vm7vq\" (UniqueName: \"kubernetes.io/projected/aca8b838-452e-4414-9a92-f3ffef31724a-kube-api-access-vm7vq\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.197309 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.197086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.197309 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.197140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aca8b838-452e-4414-9a92-f3ffef31724a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.197309 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.197169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " 
pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.197309 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.197209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.197501 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.197420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.197562 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.197535 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.197666 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.197643 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.199355 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.199333 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aca8b838-452e-4414-9a92-f3ffef31724a-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.199695 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.199677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aca8b838-452e-4414-9a92-f3ffef31724a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.204342 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.204319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm7vq\" (UniqueName: \"kubernetes.io/projected/aca8b838-452e-4414-9a92-f3ffef31724a-kube-api-access-vm7vq\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8\" (UID: \"aca8b838-452e-4414-9a92-f3ffef31724a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.314468 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.314384 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:38.446786 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.446740 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8"] Apr 16 13:24:38.450039 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:24:38.450008 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca8b838_452e_4414_9a92_f3ffef31724a.slice/crio-e35e3cde39729d262a7cbf2aba7c136afa75314775c0a8704ae702190870119c WatchSource:0}: Error finding container e35e3cde39729d262a7cbf2aba7c136afa75314775c0a8704ae702190870119c: Status 404 returned error can't find the container with id e35e3cde39729d262a7cbf2aba7c136afa75314775c0a8704ae702190870119c Apr 16 13:24:38.475364 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:38.475333 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" event={"ID":"aca8b838-452e-4414-9a92-f3ffef31724a","Type":"ContainerStarted","Data":"e35e3cde39729d262a7cbf2aba7c136afa75314775c0a8704ae702190870119c"} Apr 16 13:24:39.480391 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:39.480353 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" event={"ID":"aca8b838-452e-4414-9a92-f3ffef31724a","Type":"ContainerStarted","Data":"d2d4220cf0f878d9b7c3c6e4807815cbb7b48611b78167dd95d4fec5160452fe"} Apr 16 13:24:46.504511 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:46.504478 2575 generic.go:358] "Generic (PLEG): container finished" podID="aca8b838-452e-4414-9a92-f3ffef31724a" containerID="d2d4220cf0f878d9b7c3c6e4807815cbb7b48611b78167dd95d4fec5160452fe" exitCode=0 Apr 16 13:24:46.504857 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:46.504548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" event={"ID":"aca8b838-452e-4414-9a92-f3ffef31724a","Type":"ContainerDied","Data":"d2d4220cf0f878d9b7c3c6e4807815cbb7b48611b78167dd95d4fec5160452fe"} Apr 16 13:24:47.512291 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:47.512258 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" event={"ID":"aca8b838-452e-4414-9a92-f3ffef31724a","Type":"ContainerStarted","Data":"2e767e582050ff64b593e572ab8e9d3a195562acbd27bb2b43d10e72c4886d9a"} Apr 16 13:24:47.512697 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:47.512553 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:24:47.529074 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:47.529026 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" podStartSLOduration=10.300046807 podStartE2EDuration="10.529012953s" podCreationTimestamp="2026-04-16 13:24:37 +0000 UTC" firstStartedPulling="2026-04-16 13:24:46.505175699 +0000 UTC m=+801.094552765" lastFinishedPulling="2026-04-16 13:24:46.734141837 +0000 UTC m=+801.323518911" observedRunningTime="2026-04-16 13:24:47.527362304 +0000 UTC m=+802.116739401" watchObservedRunningTime="2026-04-16 13:24:47.529012953 +0000 UTC m=+802.118390037" Apr 16 13:24:58.529788 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:24:58.529759 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8" Apr 16 13:26:25.968395 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:26:25.968365 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:26:25.969519 ip-10-0-142-166 
kubenswrapper[2575]: I0416 13:26:25.969499 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:31:25.988971 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:31:25.988939 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:31:25.990979 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:31:25.990957 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:36:26.009902 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:36:26.009871 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:36:26.011572 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:36:26.011550 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:41:26.032991 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:41:26.032959 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:41:26.036302 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:41:26.036282 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log" Apr 16 13:44:12.762452 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:12.762421 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-dmcnl_4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b/manager/0.log" Apr 16 13:44:12.993281 
ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:12.993249 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7565bd457c-8x8ds_75188ba9-7fa3-4368-b24a-d60610f84e4e/maas-api/0.log" Apr 16 13:44:13.357174 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:13.357140 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7mwdl_d1922597-28ff-467e-96f9-a3557c298089/manager/2.log" Apr 16 13:44:13.529105 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:13.529073 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5889847794-7ltx2_e85e7df6-cc76-468f-83fd-9907ddde903f/manager/0.log" Apr 16 13:44:14.065540 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:14.065511 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-st9jq_ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3/postgres/0.log" Apr 16 13:44:16.722357 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:16.722329 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-4hhfl_1f1a3347-d911-4625-a33f-801bd0705e43/manager/0.log" Apr 16 13:44:18.085145 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:18.085102 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-g986f_57308b09-bbb5-4479-bffd-7c974dfe32af/discovery/0.log" Apr 16 13:44:18.205693 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:18.205658 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7485ccd7bf-5qzpc_74e58751-2715-4b52-b472-4dc7b18630f7/kube-auth-proxy/0.log" Apr 16 13:44:18.992575 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:18.992548 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-66zqc_3e9d7a72-97a6-44c1-98b6-44b534c54d6a/storage-initializer/0.log" Apr 
16 13:44:18.999968 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:18.999944 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-66zqc_3e9d7a72-97a6-44c1-98b6-44b534c54d6a/main/0.log" Apr 16 13:44:19.551981 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:19.551955 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8_aca8b838-452e-4414-9a92-f3ffef31724a/storage-initializer/0.log" Apr 16 13:44:19.558732 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:19.558711 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-gbkl8_aca8b838-452e-4414-9a92-f3ffef31724a/main/0.log" Apr 16 13:44:26.524231 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:26.524198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hqc54_454de3ce-a596-4780-b1b7-e2fe418de97e/global-pull-secret-syncer/0.log" Apr 16 13:44:26.620850 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:26.620823 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-glspk_21bff1bd-c48b-4833-b3b3-ce5f1230db72/konnectivity-agent/0.log" Apr 16 13:44:26.715846 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:26.715817 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-166.ec2.internal_b77342772725edf19265f6dcbdde9121/haproxy/0.log" Apr 16 13:44:30.974299 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:30.974268 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-4hhfl_1f1a3347-d911-4625-a33f-801bd0705e43/manager/0.log" Apr 16 13:44:33.138011 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:33.137943 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-hgwk4_5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c/node-exporter/0.log" Apr 16 13:44:33.157972 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:33.157947 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hgwk4_5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c/kube-rbac-proxy/0.log" Apr 16 13:44:33.180544 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:33.180518 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hgwk4_5c5fd0e8-bce8-4b35-8e74-ac10bf55c92c/init-textfile/0.log" Apr 16 13:44:35.289592 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.289559 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc"] Apr 16 13:44:35.292822 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.292801 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.294630 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.294608 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbjl6\"/\"kube-root-ca.crt\"" Apr 16 13:44:35.294740 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.294688 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbjl6\"/\"openshift-service-ca.crt\"" Apr 16 13:44:35.295059 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.295040 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gbjl6\"/\"default-dockercfg-tfgq9\"" Apr 16 13:44:35.301947 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.301924 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc"] Apr 16 13:44:35.316774 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.316751 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbp8l\" (UniqueName: \"kubernetes.io/projected/449335c4-96f6-4991-ad3c-4899fb82582c-kube-api-access-bbp8l\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.316898 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.316781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-sys\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.316898 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.316799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-lib-modules\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.316898 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.316864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-podres\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.317052 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.316954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-proc\") pod 
\"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.417676 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.417634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbp8l\" (UniqueName: \"kubernetes.io/projected/449335c4-96f6-4991-ad3c-4899fb82582c-kube-api-access-bbp8l\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.418291 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.418267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-sys\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.418559 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.418544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-lib-modules\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.418786 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.418500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-sys\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.419086 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.418750 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-lib-modules\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.419086 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.418931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-podres\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.419245 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.419111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-proc\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.419245 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.419022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-podres\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.419245 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.419215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/449335c4-96f6-4991-ad3c-4899fb82582c-proc\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.425222 ip-10-0-142-166 kubenswrapper[2575]: I0416 
13:44:35.425200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbp8l\" (UniqueName: \"kubernetes.io/projected/449335c4-96f6-4991-ad3c-4899fb82582c-kube-api-access-bbp8l\") pod \"perf-node-gather-daemonset-kdnmc\" (UID: \"449335c4-96f6-4991-ad3c-4899fb82582c\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.604089 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.603994 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" Apr 16 13:44:35.721431 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.721406 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc"] Apr 16 13:44:35.724054 ip-10-0-142-166 kubenswrapper[2575]: W0416 13:44:35.724026 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod449335c4_96f6_4991_ad3c_4899fb82582c.slice/crio-46bce47b861e07e04678b00938d8c224868ce89382c49d14c946ac311581d8b5 WatchSource:0}: Error finding container 46bce47b861e07e04678b00938d8c224868ce89382c49d14c946ac311581d8b5: Status 404 returned error can't find the container with id 46bce47b861e07e04678b00938d8c224868ce89382c49d14c946ac311581d8b5 Apr 16 13:44:35.725681 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:35.725660 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:44:36.353711 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:36.353664 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" event={"ID":"449335c4-96f6-4991-ad3c-4899fb82582c","Type":"ContainerStarted","Data":"d8ee6e1c3294972321a947c124c19e1e4bf97ab629c2f836eafa09590453f1fb"} Apr 16 13:44:36.353711 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:36.353708 2575 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" event={"ID":"449335c4-96f6-4991-ad3c-4899fb82582c","Type":"ContainerStarted","Data":"46bce47b861e07e04678b00938d8c224868ce89382c49d14c946ac311581d8b5"}
Apr 16 13:44:36.354242 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:36.353817 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc"
Apr 16 13:44:36.368892 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:36.368853 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc" podStartSLOduration=1.3688391659999999 podStartE2EDuration="1.368839166s" podCreationTimestamp="2026-04-16 13:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:44:36.36688507 +0000 UTC m=+1990.956262168" watchObservedRunningTime="2026-04-16 13:44:36.368839166 +0000 UTC m=+1990.958216253"
Apr 16 13:44:37.349844 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:37.349812 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-msm5w_5fdfc31d-52a5-4228-aa8d-7f803085d57e/dns/0.log"
Apr 16 13:44:37.366775 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:37.366734 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-msm5w_5fdfc31d-52a5-4228-aa8d-7f803085d57e/kube-rbac-proxy/0.log"
Apr 16 13:44:37.423044 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:37.423021 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g4rk4_5e2c6e75-298e-4014-bf52-0dc9f276e559/dns-node-resolver/0.log"
Apr 16 13:44:38.021869 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:38.021844 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-psf9p_fd1864d2-5d1c-41a0-84d9-dd4835e795d5/node-ca/0.log"
Apr 16 13:44:38.906793 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:38.906758 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-g986f_57308b09-bbb5-4479-bffd-7c974dfe32af/discovery/0.log"
Apr 16 13:44:38.924626 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:38.924603 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7485ccd7bf-5qzpc_74e58751-2715-4b52-b472-4dc7b18630f7/kube-auth-proxy/0.log"
Apr 16 13:44:39.531027 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:39.531002 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9r95z_7b5d7aa9-dd9d-487f-844d-3f40b038a994/serve-healthcheck-canary/0.log"
Apr 16 13:44:40.039946 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:40.039917 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2b8cf_ba1775da-40a0-4aa2-bb2e-895725999757/kube-rbac-proxy/0.log"
Apr 16 13:44:40.056961 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:40.056931 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2b8cf_ba1775da-40a0-4aa2-bb2e-895725999757/exporter/0.log"
Apr 16 13:44:40.074177 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:40.074147 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2b8cf_ba1775da-40a0-4aa2-bb2e-895725999757/extractor/0.log"
Apr 16 13:44:42.091862 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:42.091823 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-dmcnl_4a0e0fd6-c6eb-4ce0-85c4-d5d7c07cc54b/manager/0.log"
Apr 16 13:44:42.125191 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:42.125157 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7565bd457c-8x8ds_75188ba9-7fa3-4368-b24a-d60610f84e4e/maas-api/0.log"
Apr 16 13:44:42.190881 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:42.190850 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7mwdl_d1922597-28ff-467e-96f9-a3557c298089/manager/1.log"
Apr 16 13:44:42.201859 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:42.201835 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7mwdl_d1922597-28ff-467e-96f9-a3557c298089/manager/2.log"
Apr 16 13:44:42.237258 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:42.237227 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5889847794-7ltx2_e85e7df6-cc76-468f-83fd-9907ddde903f/manager/0.log"
Apr 16 13:44:42.309974 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:42.309948 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-st9jq_ea62fa2b-68c7-4401-8b30-b85e7ebaa0c3/postgres/0.log"
Apr 16 13:44:42.369524 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:42.369441 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-kdnmc"
Apr 16 13:44:49.409987 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:49.409957 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66xwt_8156b1b5-dde0-4575-9f54-ea6d2acf9495/kube-multus-additional-cni-plugins/0.log"
Apr 16 13:44:49.428368 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:49.428344 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66xwt_8156b1b5-dde0-4575-9f54-ea6d2acf9495/egress-router-binary-copy/0.log"
Apr 16 13:44:49.450665 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:49.450636 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66xwt_8156b1b5-dde0-4575-9f54-ea6d2acf9495/cni-plugins/0.log"
Apr 16 13:44:49.468101 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:49.468078 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66xwt_8156b1b5-dde0-4575-9f54-ea6d2acf9495/bond-cni-plugin/0.log"
Apr 16 13:44:49.486256 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:49.486232 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66xwt_8156b1b5-dde0-4575-9f54-ea6d2acf9495/routeoverride-cni/0.log"
Apr 16 13:44:49.503809 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:49.503785 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66xwt_8156b1b5-dde0-4575-9f54-ea6d2acf9495/whereabouts-cni-bincopy/0.log"
Apr 16 13:44:49.521544 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:49.521518 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-66xwt_8156b1b5-dde0-4575-9f54-ea6d2acf9495/whereabouts-cni/0.log"
Apr 16 13:44:49.880904 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:49.880874 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wfgv8_c41aae66-1614-4b68-99e9-dae826ba8bff/kube-multus/0.log"
Apr 16 13:44:49.898304 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:49.898279 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9b59n_0865faea-916d-435f-88f5-d2b559f1d79a/network-metrics-daemon/0.log"
Apr 16 13:44:49.916420 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:49.916382 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9b59n_0865faea-916d-435f-88f5-d2b559f1d79a/kube-rbac-proxy/0.log"
Apr 16 13:44:51.192484 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:51.191716 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-controller/0.log"
Apr 16 13:44:51.211569 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:51.211541 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/0.log"
Apr 16 13:44:51.220797 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:51.220771 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovn-acl-logging/1.log"
Apr 16 13:44:51.237764 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:51.237733 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/kube-rbac-proxy-node/0.log"
Apr 16 13:44:51.256018 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:51.255988 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 13:44:51.270699 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:51.270671 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/northd/0.log"
Apr 16 13:44:51.288441 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:51.288417 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/nbdb/0.log"
Apr 16 13:44:51.305540 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:51.305515 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/sbdb/0.log"
Apr 16 13:44:51.390921 ip-10-0-142-166 kubenswrapper[2575]: I0416 13:44:51.390893 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jzvgp_25d5ba90-b543-425c-9992-d5d1d1a63331/ovnkube-controller/0.log"