Apr 16 22:11:04.565241 ip-10-0-142-35 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 22:11:04.565256 ip-10-0-142-35 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 22:11:04.565266 ip-10-0-142-35 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 22:11:04.565597 ip-10-0-142-35 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 22:11:15.967625 ip-10-0-142-35 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 22:11:15.967643 ip-10-0-142-35 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0a20e36b0e224d548807773935493792 --
Apr 16 22:13:45.069195 ip-10-0-142-35 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:45.531254 ip-10-0-142-35 kubenswrapper[2560]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:45.531254 ip-10-0-142-35 kubenswrapper[2560]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:45.531254 ip-10-0-142-35 kubenswrapper[2560]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:45.531254 ip-10-0-142-35 kubenswrapper[2560]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:45.531254 ip-10-0-142-35 kubenswrapper[2560]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:45.532272 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.532188 2560 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:13:45.539291 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539268 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:45.539291 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539287 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:45.539291 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539290 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:45.539291 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539293 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:45.539291 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539296 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:45.539291 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539299 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539302 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539305 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539308 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539310 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539313 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539315 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539318 2560 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539321 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539323 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539326 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539329 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539331 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539333 2560 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539336 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539338 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539341 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539344 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539346 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539349 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:45.539509 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539351 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539354 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539356 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539359 2560 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539362 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539364 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539367 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539371 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
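The deprecation warnings above all point the same way: these switches belong in the file named by `--config` rather than on the kubelet.service command line. A minimal sketch of the corresponding `KubeletConfiguration` fields (values here are illustrative, not taken from this node; field names follow the upstream kubelet config API, and `--pod-infra-container-image` intentionally has no config-file equivalent, since per the log the sandbox image is taken from the CRI runtime):

```yaml
# Illustrative fragment of /etc/kubernetes/kubelet.conf replacing the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"      # was --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # was --volume-plugin-dir
systemReserved:                 # was --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:                   # eviction thresholds replace --minimum-container-ttl-duration
  memory.available: "100Mi"
```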
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539375 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539378 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539380 2560 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539383 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539385 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539388 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539392 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539395 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539397 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539400 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539403 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539405 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:45.539999 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539408 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539411 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539413 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539416 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539419 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539422 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539424 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539427 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539429 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539432 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539435 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539437 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539442 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539445 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539448 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539451 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539453 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539456 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539459 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:45.540493 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539463 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539467 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539471 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539475 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539477 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539480 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539483 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539487 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539490 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539493 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539496 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539498 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539501 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539504 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539508 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539511 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539513 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539516 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539520 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:45.540984 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539523 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539526 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539528 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539953 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539959 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539963 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539965 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539968 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539971 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539974 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539976 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539979 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539982 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539984 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539987 2560 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539990 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539992 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539995 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.539998 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540001 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:45.541439 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540004 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540008 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
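The repeated `unrecognized feature gate` warnings are OpenShift-specific gate names being handed to an upstream kubelet that does not know them; they are noisy but not fatal, and the whole list is even logged twice. When triaging, it helps to collapse the flood into one count per gate. A minimal sketch in Python (the sample lines below stand in for real `journalctl -u kubelet` output):

```python
import re
from collections import Counter

# Sample journal lines standing in for `journalctl -u kubelet` output.
log = """\
W0416 22:13:45.539268 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
W0416 22:13:45.539953 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig
W0416 22:13:45.539965 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
"""

# Collapse the repeated warnings into one count per gate name.
gates = Counter(re.findall(r"unrecognized feature gate: (\w+)", log))
for name, count in gates.most_common():
    print(f"{count:3d} {name}")
```

In a real session the `log` string would be read from `journalctl` output piped to the script.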
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540011 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540015 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540018 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540020 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540024 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540027 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540030 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540032 2560 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540035 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540038 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540040 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540043 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540045 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540048 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540050 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540053 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540055 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:45.541938 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540058 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540061 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540063 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540066 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540068 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540072 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540075 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540077 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540080 2560 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540082 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540085 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540087 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540090 2560 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540093 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540095 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540098 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540100 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540103 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540105 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540108 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:45.542492 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540110 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540113 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540116 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540118 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540121 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540123 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540126 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540128 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540131 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540133 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540136 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540138 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540141 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540146 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540148 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540151 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540153 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540156 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540158 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540161 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:45.542998 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540163 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540166 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540168 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540171 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540173 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540176 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540180 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540182 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540185 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.540189 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540933 2560 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540948 2560 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540955 2560 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540959 2560 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540968 2560 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540971 2560 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540976 2560 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540981 2560 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540984 2560 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540987 2560 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540991 2560 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:45.543486 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540995 2560 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.540998 2560 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541001 2560 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541004 2560 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541007 2560 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541010 2560 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541013 2560 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541016 2560 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541021 2560 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541024 2560 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541027 2560 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541029 2560 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541033 2560 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541037 2560 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541040 2560 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541044 2560 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541048 2560 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541051 2560 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541054 2560 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541057 2560 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541060 2560 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541063 2560 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541067 2560 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541070 2560 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541073 2560 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:45.544023 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541076 2560 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541079 2560 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541082 2560 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541087 2560 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541090 2560 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541093 2560 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541096 2560 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541099 2560 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541103 2560 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541106 2560 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541109 2560 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541112 2560 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541115 2560 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541118 2560 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541121 2560 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541124 2560 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541127 2560 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 22:13:45.544652 ip-10-0-142-35
kubenswrapper[2560]: I0416 22:13:45.541130 2560 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541133 2560 flags.go:64] FLAG: --feature-gates="" Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541137 2560 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541140 2560 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541143 2560 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541147 2560 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541150 2560 flags.go:64] FLAG: --healthz-port="10248" Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541153 2560 flags.go:64] FLAG: --help="false" Apr 16 22:13:45.544652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541156 2560 flags.go:64] FLAG: --hostname-override="ip-10-0-142-35.ec2.internal" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541159 2560 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541162 2560 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541165 2560 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541169 2560 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541173 2560 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 
22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541176 2560 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541179 2560 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541182 2560 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541185 2560 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541188 2560 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541192 2560 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541195 2560 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541198 2560 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541201 2560 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541204 2560 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541207 2560 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541210 2560 flags.go:64] FLAG: --lock-file="" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541213 2560 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541216 2560 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541219 2560 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541225 2560 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541228 2560 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541231 2560 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:13:45.545307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541233 2560 flags.go:64] FLAG: --logging-format="text" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541236 2560 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541240 2560 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541242 2560 flags.go:64] FLAG: --manifest-url="" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541245 2560 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541250 2560 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541253 2560 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541257 2560 flags.go:64] FLAG: --max-pods="110" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541260 2560 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541264 2560 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541267 2560 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 22:13:45.545892 ip-10-0-142-35 
kubenswrapper[2560]: I0416 22:13:45.541269 2560 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541273 2560 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541276 2560 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541279 2560 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541286 2560 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541289 2560 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541292 2560 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541295 2560 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541299 2560 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541304 2560 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541307 2560 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541310 2560 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541313 2560 flags.go:64] FLAG: --port="10250" Apr 16 22:13:45.545892 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541316 2560 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:13:45.546525 
ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541319 2560 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0bef753d59c7e5e6b" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541322 2560 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541325 2560 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541328 2560 flags.go:64] FLAG: --register-node="true" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541331 2560 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541334 2560 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541338 2560 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541340 2560 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541343 2560 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541346 2560 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541353 2560 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541356 2560 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541359 2560 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541361 2560 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541364 2560 flags.go:64] FLAG: --runonce="false" Apr 16 22:13:45.546525 ip-10-0-142-35 
kubenswrapper[2560]: I0416 22:13:45.541367 2560 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541370 2560 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541373 2560 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541376 2560 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541379 2560 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541382 2560 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541385 2560 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541388 2560 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541391 2560 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541394 2560 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:13:45.546525 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541397 2560 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541400 2560 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541405 2560 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541408 2560 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541411 2560 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541416 2560 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541419 2560 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541422 2560 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541426 2560 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541429 2560 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541432 2560 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541435 2560 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541437 2560 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541441 2560 flags.go:64] FLAG: --v="2" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541445 2560 flags.go:64] FLAG: --version="false" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541450 2560 flags.go:64] FLAG: --vmodule="" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541454 2560 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.541458 2560 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541570 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: 
W0416 22:13:45.541575 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541579 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541582 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541585 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:45.547193 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541588 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541590 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541593 2560 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541596 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541598 2560 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541601 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541603 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541606 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541609 2560 feature_gate.go:328] unrecognized feature 
gate: NewOLM Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541611 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541614 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541618 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541621 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541624 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541627 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541629 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541632 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541634 2560 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541638 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:45.547789 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541642 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541644 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541647 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541650 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541652 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541655 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541658 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541661 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541664 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541668 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541671 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541674 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541676 2560 feature_gate.go:328] unrecognized feature 
gate: InsightsOnDemandDataGather Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541679 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541682 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541685 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541687 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541690 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541693 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541695 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:45.548278 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541698 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541700 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541703 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541706 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541709 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541712 
2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541715 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541718 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541720 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541723 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541726 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541728 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541731 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541734 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541736 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541740 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541743 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541746 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541748 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:45.548777 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541751 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541753 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541757 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541760 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541763 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541765 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541767 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541770 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541773 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541775 2560 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541778 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541780 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541783 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541785 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541788 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541790 2560 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541792 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541796 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541799 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541802 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:45.549262 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541805 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:45.549756 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541807 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:45.549756 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.541810 2560 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:45.549756 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.543022 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:45.550343 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.550321 2560 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 22:13:45.550382 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.550345 2560 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 22:13:45.550413 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550395 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:45.550413 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550401 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:45.550413 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550405 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:45.550413 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550409 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:45.550413 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550413 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550416 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550419 
2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550422 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550424 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550428 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550431 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550435 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550439 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550442 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550447 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550451 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550454 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550457 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550459 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550462 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550465 2560 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550468 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550470 2560 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:45.550542 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550473 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550476 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550478 2560 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550481 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550484 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 
16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550487 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550489 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550492 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550495 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550499 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550501 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550503 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550506 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550509 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550512 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550515 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550517 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550520 
2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550523 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550525 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:45.551086 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550527 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550530 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550532 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550535 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550538 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550541 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550543 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550546 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550549 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550552 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:45.551615 ip-10-0-142-35 
kubenswrapper[2560]: W0416 22:13:45.550555 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550557 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550560 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550562 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550565 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550567 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550570 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550573 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550576 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550579 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:45.551615 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550581 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550585 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550587 2560 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550590 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550592 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550595 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550598 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550600 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550603 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550606 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550608 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550611 2560 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550613 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550617 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550620 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550622 2560 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550625 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550627 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550630 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550632 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:45.552140 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550635 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550637 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550640 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.550645 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550741 2560 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 
22:13:45.550745 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550749 2560 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550752 2560 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550755 2560 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550758 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550761 2560 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550763 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550766 2560 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550768 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550771 2560 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550774 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:45.552633 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550777 2560 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550779 2560 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: 
W0416 22:13:45.550782 2560 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550784 2560 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550787 2560 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550790 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550793 2560 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550795 2560 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550798 2560 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550801 2560 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550803 2560 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550806 2560 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550809 2560 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550811 2560 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550814 2560 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 
22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550817 2560 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550819 2560 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550821 2560 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550824 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550826 2560 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:45.553057 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550829 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550832 2560 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550834 2560 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550837 2560 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550839 2560 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550842 2560 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550844 2560 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550847 2560 feature_gate.go:328] 
unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550849 2560 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550851 2560 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550854 2560 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550857 2560 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550860 2560 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550862 2560 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550865 2560 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550879 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550882 2560 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550885 2560 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550888 2560 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:45.553547 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550890 2560 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:45.553547 ip-10-0-142-35 
kubenswrapper[2560]: W0416 22:13:45.550893 2560 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550896 2560 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550898 2560 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550901 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550903 2560 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550906 2560 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550909 2560 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550913 2560 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550916 2560 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550919 2560 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550922 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550924 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550927 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550929 2560 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550932 2560 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550934 2560 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550937 2560 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550939 2560 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550942 2560 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:13:45.554076 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550946 2560 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550948 2560 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550951 2560 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550954 2560 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550956 2560 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550959 2560 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550962 2560 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550965 2560 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550968 2560 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550971 2560 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550973 2560 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550976 2560 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550978 2560 feature_gate.go:328] unrecognized 
feature gate: InsightsConfig Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550982 2560 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:45.550984 2560 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.550989 2560 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:45.554574 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.551822 2560 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 22:13:45.555082 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.553825 2560 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 22:13:45.556001 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.555989 2560 server.go:1019] "Starting client certificate rotation" Apr 16 22:13:45.556104 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.556090 2560 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:13:45.556138 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.556127 2560 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:13:45.581305 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.581285 2560 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:45.584039 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.584019 2560 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:45.597321 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.597296 2560 log.go:25] "Validated CRI v1 runtime API" Apr 16 22:13:45.603493 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.603475 2560 log.go:25] "Validated CRI v1 image API" Apr 16 22:13:45.605064 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.605046 2560 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 22:13:45.607723 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.607706 2560 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:13:45.608484 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.608463 2560 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 af364aaa-fc41-4f03-a7da-48e7dd2e72a9:/dev/nvme0n1p4 e3b77852-7466-4d11-a7e7-9a18ad476ae3:/dev/nvme0n1p3] Apr 16 22:13:45.608531 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.608485 2560 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 22:13:45.613739 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.613627 2560 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:45.612363292 +0000 UTC m=+0.420616455 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 
NumSockets:1 CpuFrequency:3093576 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d11ef355eb59badf4ceb9fc75f01d SystemUUID:ec2d11ef-355e-b59b-adf4-ceb9fc75f01d BootID:0a20e36b-0e22-4d54-8807-773935493792 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:55:32:79:a2:27 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:55:32:79:a2:27 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4a:b4:7d:df:58:3b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] 
SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 22:13:45.613739 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.613735 2560 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 22:13:45.613856 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.613818 2560 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 22:13:45.615591 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.615566 2560 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 22:13:45.615740 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.615594 2560 container_manager_linux.go:275] "Creating Container Manager object based on Node Config"
nodeConfig={"NodeName":"ip-10-0-142-35.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 22:13:45.615782 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.615751 2560 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 22:13:45.615782 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.615760 2560 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 22:13:45.615782 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.615773 2560
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:13:45.615881 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.615785 2560 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:13:45.617172 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.617161 2560 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:13:45.617277 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.617268 2560 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 22:13:45.619807 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.619796 2560 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 22:13:45.619838 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.619812 2560 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 22:13:45.619838 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.619824 2560 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 22:13:45.619838 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.619834 2560 kubelet.go:397] "Adding apiserver pod source"
Apr 16 22:13:45.619965 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.619848 2560 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 22:13:45.620933 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.620920 2560 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:13:45.620973 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.620945 2560 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:13:45.624282 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.624265 2560 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 22:13:45.625636 ip-10-0-142-35
kubenswrapper[2560]: I0416 22:13:45.625619 2560 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 22:13:45.627701 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627686 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 22:13:45.627772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627707 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 22:13:45.627772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627716 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 22:13:45.627772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627725 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 22:13:45.627772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627747 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 22:13:45.627772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627757 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 22:13:45.627772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627766 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 22:13:45.627772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627774 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 22:13:45.628061 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627784 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 22:13:45.628061 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627836 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 22:13:45.628061 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627885 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 22:13:45.628061
ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.627900 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 22:13:45.628765 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.628754 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 22:13:45.628824 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.628769 2560 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 22:13:45.629981 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.629960 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-35.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 22:13:45.630097 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.629960 2560 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 22:13:45.630300 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.630282 2560 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gdk64"
Apr 16 22:13:45.632664 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.632648 2560 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 22:13:45.632758 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.632694 2560 server.go:1295] "Started kubelet"
Apr 16 22:13:45.632825 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.632790 2560 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 22:13:45.635662 ip-10-0-142-35 systemd[1]: Started Kubernetes Kubelet.
Apr 16 22:13:45.635853 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.635629 2560 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 22:13:45.635853 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.635719 2560 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 22:13:45.637627 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.637575 2560 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 22:13:45.638439 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.638412 2560 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gdk64"
Apr 16 22:13:45.639259 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.639243 2560 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 22:13:45.640586 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.640568 2560 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-35.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 22:13:45.641601 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.640617 2560 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-35.ec2.internal.18a6f609f2c96130 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-35.ec2.internal,UID:ip-10-0-142-35.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-35.ec2.internal,},FirstTimestamp:2026-04-16 22:13:45.632661808 +0000 UTC
m=+0.440914952,LastTimestamp:2026-04-16 22:13:45.632661808 +0000 UTC m=+0.440914952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-35.ec2.internal,}"
Apr 16 22:13:45.642702 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.642682 2560 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 22:13:45.643946 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.643929 2560 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 22:13:45.644031 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.643949 2560 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:45.644588 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644571 2560 factory.go:55] Registering systemd factory
Apr 16 22:13:45.644588 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644582 2560 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 22:13:45.644588 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644590 2560 factory.go:223] Registration of the systemd container factory successfully
Apr 16 22:13:45.644765 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644584 2560 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 22:13:45.644765 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644606 2560 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 22:13:45.644765 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644731 2560 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 22:13:45.644765 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644739 2560 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 22:13:45.644973 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644782 2560 factory.go:153] Registering CRI-O
factory
Apr 16 22:13:45.644973 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644799 2560 factory.go:223] Registration of the crio container factory successfully
Apr 16 22:13:45.644973 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644848 2560 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 22:13:45.644973 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644892 2560 factory.go:103] Registering Raw factory
Apr 16 22:13:45.644973 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.644909 2560 manager.go:1196] Started watching for new ooms in manager
Apr 16 22:13:45.645223 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.645165 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:45.645337 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.645320 2560 manager.go:319] Starting recovery of all containers
Apr 16 22:13:45.653215 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.653066 2560 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:45.655565 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.655537 2560 manager.go:324] Recovery completed
Apr 16 22:13:45.656259 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.656145 2560 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-35.ec2.internal\" not found" node="ip-10-0-142-35.ec2.internal"
Apr 16 22:13:45.656703 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.656686 2560 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 22:13:45.660928 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.660913 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:45.664247 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.664227 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:45.664337 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.664264 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:45.664337 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.664277 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:45.665151 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.665135 2560 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 22:13:45.665151 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.665148 2560 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 22:13:45.665284 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.665169 2560 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:13:45.667571 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.667557 2560 policy_none.go:49] "None policy: Start"
Apr 16 22:13:45.667647 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.667576 2560 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 22:13:45.667647 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.667589 2560 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 22:13:45.710830 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.698319 2560 manager.go:341] "Starting Device Plugin manager"
Apr 16 22:13:45.710830 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.698354 2560 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found"
checkpoint="kubelet_internal_checkpoint"
Apr 16 22:13:45.710830 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.698365 2560 server.go:85] "Starting device plugin registration server"
Apr 16 22:13:45.710830 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.698626 2560 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 22:13:45.710830 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.698638 2560 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 22:13:45.710830 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.698787 2560 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 22:13:45.710830 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.698893 2560 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 22:13:45.710830 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.698903 2560 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 22:13:45.710830 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.700111 2560 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 22:13:45.710830 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.700150 2560 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:45.770931 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.770889 2560 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 22:13:45.772142 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.772125 2560 kubelet_network_linux.go:49] "Initialized iptables rules."
protocol="IPv6"
Apr 16 22:13:45.772221 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.772157 2560 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 22:13:45.772221 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.772183 2560 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 22:13:45.772221 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.772192 2560 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 22:13:45.772368 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.772232 2560 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 22:13:45.775454 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.775432 2560 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:45.799369 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.799315 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:45.800939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.800923 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:45.801027 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.800951 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:45.801027 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.800961 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:45.801027 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.800984 2560 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-35.ec2.internal"
Apr 16 22:13:45.809359 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.809345 2560
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-35.ec2.internal"
Apr 16 22:13:45.809407 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.809366 2560 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-35.ec2.internal\": node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:45.824856 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.824828 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:45.872500 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.872457 2560 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal"]
Apr 16 22:13:45.872640 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.872558 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:45.874032 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.874015 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:45.874133 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.874043 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:45.874133 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.874053 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:45.875348 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.875336 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:45.875512 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.875499 2560 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:45.875561 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.875527 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:45.876039 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.876021 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:45.876129 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.876048 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:45.876129 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.876057 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:45.876129 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.876089 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:45.876129 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.876107 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:45.876129 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.876117 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:45.877299 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.877286 2560 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:45.877346 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.877308 2560 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:45.877930 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.877915 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:45.878009 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.877948 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:45.878009 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.877961 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:45.901647 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.901624 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-35.ec2.internal\" not found" node="ip-10-0-142-35.ec2.internal"
Apr 16 22:13:45.906189 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.906174 2560 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-35.ec2.internal\" not found" node="ip-10-0-142-35.ec2.internal"
Apr 16 22:13:45.925455 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:45.925432 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:45.946593 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.946569 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e7ebb63d89fc306c77bf2d82a3bc33b8-etc-kube\") pod
\"kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal\" (UID: \"e7ebb63d89fc306c77bf2d82a3bc33b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:45.946673 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.946597 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7ebb63d89fc306c77bf2d82a3bc33b8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal\" (UID: \"e7ebb63d89fc306c77bf2d82a3bc33b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:45.946673 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:45.946616 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e12fe8cf6bc72ee22e06a7c9a61455c5-config\") pod \"kube-apiserver-proxy-ip-10-0-142-35.ec2.internal\" (UID: \"e12fe8cf6bc72ee22e06a7c9a61455c5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:46.026395 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:46.026357 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:46.046729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.046709 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e12fe8cf6bc72ee22e06a7c9a61455c5-config\") pod \"kube-apiserver-proxy-ip-10-0-142-35.ec2.internal\" (UID: \"e12fe8cf6bc72ee22e06a7c9a61455c5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:46.046818 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.046748 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName:
\"kubernetes.io/host-path/e7ebb63d89fc306c77bf2d82a3bc33b8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal\" (UID: \"e7ebb63d89fc306c77bf2d82a3bc33b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:46.046818 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.046768 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7ebb63d89fc306c77bf2d82a3bc33b8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal\" (UID: \"e7ebb63d89fc306c77bf2d82a3bc33b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:46.046914 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.046825 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e12fe8cf6bc72ee22e06a7c9a61455c5-config\") pod \"kube-apiserver-proxy-ip-10-0-142-35.ec2.internal\" (UID: \"e12fe8cf6bc72ee22e06a7c9a61455c5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:46.046914 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.046891 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e7ebb63d89fc306c77bf2d82a3bc33b8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal\" (UID: \"e7ebb63d89fc306c77bf2d82a3bc33b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:46.046983 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.046924 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7ebb63d89fc306c77bf2d82a3bc33b8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal\" (UID: \"e7ebb63d89fc306c77bf2d82a3bc33b8\") "
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:46.127118 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:46.127084 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:46.203660 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.203626 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:46.208375 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.208356 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal"
Apr 16 22:13:46.228215 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:46.228188 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:46.328754 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:46.328707 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:46.429270 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:46.429198 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:46.467208 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.467182 2560 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:46.530354 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:46.530319 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found"
Apr 16 22:13:46.554809 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.554777 2560 transport.go:147] "Certificate rotation detected, shutting down client connections to start using
new credentials" Apr 16 22:13:46.555246 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.554950 2560 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:46.555246 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.554957 2560 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:46.555246 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.554959 2560 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:46.631341 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:46.631312 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found" Apr 16 22:13:46.640485 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.640439 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:45 +0000 UTC" deadline="2027-12-25 21:22:06.617119858 +0000 UTC" Apr 16 22:13:46.640485 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.640484 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14831h8m19.976640977s" Apr 16 22:13:46.644307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.644284 2560 certificate_manager.go:566] "Rotating certificates" 
logger="kubernetes.io/kubelet-serving" Apr 16 22:13:46.658560 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.658523 2560 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:13:46.682856 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.682795 2560 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9xv95" Apr 16 22:13:46.689995 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.689976 2560 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9xv95" Apr 16 22:13:46.717860 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:46.717826 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7ebb63d89fc306c77bf2d82a3bc33b8.slice/crio-00a251167cba6221b1f5793c6b16c540802589d3bfb2483ef4f7e2641e82acf6 WatchSource:0}: Error finding container 00a251167cba6221b1f5793c6b16c540802589d3bfb2483ef4f7e2641e82acf6: Status 404 returned error can't find the container with id 00a251167cba6221b1f5793c6b16c540802589d3bfb2483ef4f7e2641e82acf6 Apr 16 22:13:46.718089 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:46.718066 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12fe8cf6bc72ee22e06a7c9a61455c5.slice/crio-f55d856d29ccab9067139ba30a9bb2065ac7b4e66caa4cbbae98a1e265622038 WatchSource:0}: Error finding container f55d856d29ccab9067139ba30a9bb2065ac7b4e66caa4cbbae98a1e265622038: Status 404 returned error can't find the container with id f55d856d29ccab9067139ba30a9bb2065ac7b4e66caa4cbbae98a1e265622038 Apr 16 22:13:46.722758 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.722740 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider 
Apr 16 22:13:46.731804 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:46.731785 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found" Apr 16 22:13:46.774978 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.774922 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal" event={"ID":"e7ebb63d89fc306c77bf2d82a3bc33b8","Type":"ContainerStarted","Data":"00a251167cba6221b1f5793c6b16c540802589d3bfb2483ef4f7e2641e82acf6"} Apr 16 22:13:46.775860 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:46.775836 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal" event={"ID":"e12fe8cf6bc72ee22e06a7c9a61455c5","Type":"ContainerStarted","Data":"f55d856d29ccab9067139ba30a9bb2065ac7b4e66caa4cbbae98a1e265622038"} Apr 16 22:13:46.832045 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:46.832006 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found" Apr 16 22:13:46.932569 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:46.932537 2560 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-35.ec2.internal\" not found" Apr 16 22:13:47.003595 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.003511 2560 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:47.044108 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.044076 2560 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal" Apr 16 22:13:47.056142 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.056112 2560 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS 
label is recommended: [must not contain dots]" Apr 16 22:13:47.057019 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.057005 2560 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal" Apr 16 22:13:47.063476 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.063460 2560 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:47.620490 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.620463 2560 apiserver.go:52] "Watching apiserver" Apr 16 22:13:47.627604 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.627583 2560 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 22:13:47.628036 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.628014 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-4frh7","kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx","openshift-dns/node-resolver-p58zd","openshift-image-registry/node-ca-bn8st","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal","openshift-multus/multus-4gzw4","openshift-multus/network-metrics-daemon-cph62","openshift-cluster-node-tuning-operator/tuned-7zwqh","openshift-multus/multus-additional-cni-plugins-9sv6h","openshift-network-diagnostics/network-check-target-qfmn8","openshift-network-operator/iptables-alerter-249mr","openshift-ovn-kubernetes/ovnkube-node-5ggql"] Apr 16 22:13:47.629552 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.629524 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.630609 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.630586 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.631956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.631916 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 22:13:47.632233 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.632214 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 22:13:47.632399 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.632379 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 22:13:47.632451 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.632433 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5wclm\"" Apr 16 22:13:47.632517 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.632487 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 22:13:47.633005 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.632987 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 22:13:47.633106 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.633075 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 22:13:47.633229 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.633210 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 22:13:47.633229 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.633226 2560 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p97zh\"" Apr 16 22:13:47.633353 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.633280 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p58zd" Apr 16 22:13:47.634601 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.634580 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bn8st" Apr 16 22:13:47.635310 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.635290 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4bxft\"" Apr 16 22:13:47.635405 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.635312 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 22:13:47.635405 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.635330 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 22:13:47.636125 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.635934 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4frh7" Apr 16 22:13:47.636125 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.635995 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:47.636125 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:47.636061 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:13:47.636581 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.636539 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qqddp\"" Apr 16 22:13:47.636668 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.636590 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 22:13:47.636668 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.636609 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 22:13:47.636782 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.636545 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 22:13:47.637729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.637444 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.638093 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.638074 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-r2mbz\"" Apr 16 22:13:47.638259 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.638240 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 22:13:47.638330 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.638282 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 22:13:47.639066 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.639051 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.639595 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.639577 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:47.639702 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.639686 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:47.639958 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.639936 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2shqw\"" Apr 16 22:13:47.640353 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.640330 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:47.640440 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:47.640420 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:13:47.641070 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.641044 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 22:13:47.641243 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.641143 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 22:13:47.641456 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.641437 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mzrbl\"" Apr 16 22:13:47.644354 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.644334 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-249mr" Apr 16 22:13:47.646601 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.646578 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.647436 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.647416 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:47.647668 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.647646 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:47.647732 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.647676 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-thbxz\"" Apr 16 22:13:47.647786 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.647774 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 22:13:47.648663 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.648642 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 22:13:47.648775 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.648663 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 22:13:47.648923 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.648901 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 22:13:47.649016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.648948 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 22:13:47.649389 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.649369 2560 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dk4kq\"" Apr 16 22:13:47.649485 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.649372 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 22:13:47.649485 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.649455 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 22:13:47.654362 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654339 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-systemd-units\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.654457 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654376 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-run-systemd\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.654457 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654399 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-var-lib-kubelet\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.654457 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654421 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-etc-kubernetes\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.654457 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654445 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/81cb0b16-9454-40e0-907f-db4de6741a0c-tmp-dir\") pod \"node-resolver-p58zd\" (UID: \"81cb0b16-9454-40e0-907f-db4de6741a0c\") " pod="openshift-dns/node-resolver-p58zd" Apr 16 22:13:47.654655 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654468 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-sysctl-d\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.654655 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654493 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqsrd\" (UniqueName: \"kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd\") pod \"network-check-target-qfmn8\" (UID: \"fac69899-e7fe-4a77-b0b3-504c9f451bdf\") " pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:47.654655 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654520 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-slash\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.654655 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654546 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mb7n\" (UniqueName: \"kubernetes.io/projected/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-kube-api-access-6mb7n\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.654655 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654585 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-cni-dir\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.654655 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654609 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-cni-binary-copy\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654657 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-var-lib-cni-multus\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654688 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 
22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654718 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-registration-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654746 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-device-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654770 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfgd5\" (UniqueName: \"kubernetes.io/projected/81cb0b16-9454-40e0-907f-db4de6741a0c-kube-api-access-hfgd5\") pod \"node-resolver-p58zd\" (UID: \"81cb0b16-9454-40e0-907f-db4de6741a0c\") " pod="openshift-dns/node-resolver-p58zd" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654793 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/11442be0-1bee-4a8d-8374-44ea164b6268-serviceca\") pod \"node-ca-bn8st\" (UID: \"11442be0-1bee-4a8d-8374-44ea164b6268\") " pod="openshift-image-registry/node-ca-bn8st" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654814 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-sysconfig\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654839 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-sysctl-conf\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654864 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-run\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654909 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-run-netns\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654933 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-ovn-node-metrics-cert\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.654956 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654955 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-os-release\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.654978 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11442be0-1bee-4a8d-8374-44ea164b6268-host\") pod \"node-ca-bn8st\" (UID: \"11442be0-1bee-4a8d-8374-44ea164b6268\") " pod="openshift-image-registry/node-ca-bn8st"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655004 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f6rn\" (UniqueName: \"kubernetes.io/projected/f45f69f8-87a8-49f9-bb8b-485368427802-kube-api-access-9f6rn\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655027 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z572t\" (UniqueName: \"kubernetes.io/projected/40fbccdd-dab4-458a-93d4-a60daf555bc6-kube-api-access-z572t\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655071 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e68012e7-cccf-4da0-86e1-1355b80e2784-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655117 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-kubelet\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655153 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-var-lib-openvswitch\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655185 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-system-cni-dir\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655210 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-socket-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655257 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-cnibin\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655285 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-cni-bin\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655308 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-env-overrides\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655342 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-run-multus-certs\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655386 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhz2\" (UniqueName: \"kubernetes.io/projected/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-kube-api-access-vjhz2\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655411 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fc7d6449-50b3-4246-90df-a37a0edc66d9-agent-certs\") pod \"konnectivity-agent-4frh7\" (UID: \"fc7d6449-50b3-4246-90df-a37a0edc66d9\") " pod="kube-system/konnectivity-agent-4frh7"
Apr 16 22:13:47.655464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655452 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655483 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29af9e08-7c9a-47dc-a8e2-f5af3aee18db-host-slash\") pod \"iptables-alerter-249mr\" (UID: \"29af9e08-7c9a-47dc-a8e2-f5af3aee18db\") " pod="openshift-network-operator/iptables-alerter-249mr"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655518 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-run-ovn-kubernetes\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655543 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-cni-netd\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655567 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-run-k8s-cni-cncf-io\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655589 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-conf-dir\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655611 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-sys\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655636 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-var-lib-cni-bin\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655692 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-sys-fs\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655727 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/81cb0b16-9454-40e0-907f-db4de6741a0c-hosts-file\") pod \"node-resolver-p58zd\" (UID: \"81cb0b16-9454-40e0-907f-db4de6741a0c\") " pod="openshift-dns/node-resolver-p58zd"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655752 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-lib-modules\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655779 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jp7\" (UniqueName: \"kubernetes.io/projected/e68012e7-cccf-4da0-86e1-1355b80e2784-kube-api-access-92jp7\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655802 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-cnibin\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655841 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-socket-dir-parent\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655862 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-daemon-config\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655905 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc967\" (UniqueName: \"kubernetes.io/projected/11442be0-1bee-4a8d-8374-44ea164b6268-kube-api-access-cc967\") pod \"node-ca-bn8st\" (UID: \"11442be0-1bee-4a8d-8374-44ea164b6268\") " pod="openshift-image-registry/node-ca-bn8st"
Apr 16 22:13:47.656055 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655927 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-systemd\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655948 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e68012e7-cccf-4da0-86e1-1355b80e2784-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.655975 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-ovnkube-script-lib\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656012 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-kubernetes\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656053 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-var-lib-kubelet\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656081 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx7tk\" (UniqueName: \"kubernetes.io/projected/29af9e08-7c9a-47dc-a8e2-f5af3aee18db-kube-api-access-sx7tk\") pod \"iptables-alerter-249mr\" (UID: \"29af9e08-7c9a-47dc-a8e2-f5af3aee18db\") " pod="openshift-network-operator/iptables-alerter-249mr"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656107 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-run-ovn\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656131 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-log-socket\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656154 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4cf5\" (UniqueName: \"kubernetes.io/projected/5fbb596d-4520-40e0-a72b-a223546a6d8f-kube-api-access-b4cf5\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656178 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-run-openvswitch\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656218 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-etc-openvswitch\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656251 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-run-netns\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656282 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-modprobe-d\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656312 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-tuned\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656338 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656374 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/29af9e08-7c9a-47dc-a8e2-f5af3aee18db-iptables-alerter-script\") pod \"iptables-alerter-249mr\" (UID: \"29af9e08-7c9a-47dc-a8e2-f5af3aee18db\") " pod="openshift-network-operator/iptables-alerter-249mr"
Apr 16 22:13:47.656727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656396 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-hostroot\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.657360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656420 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-etc-selinux\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx"
Apr 16 22:13:47.657360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656467 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-node-log\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.657360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656495 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fc7d6449-50b3-4246-90df-a37a0edc66d9-konnectivity-ca\") pod \"konnectivity-agent-4frh7\" (UID: \"fc7d6449-50b3-4246-90df-a37a0edc66d9\") " pod="kube-system/konnectivity-agent-4frh7"
Apr 16 22:13:47.657360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656518 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-host\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.657360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656543 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e68012e7-cccf-4da0-86e1-1355b80e2784-cni-binary-copy\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.657360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656613 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.657360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656684 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-ovnkube-config\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.657360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656716 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/40fbccdd-dab4-458a-93d4-a60daf555bc6-tmp\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.657360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656740 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-system-cni-dir\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.657360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.656762 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-os-release\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.690966 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.690931 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:46 +0000 UTC" deadline="2027-11-26 04:50:47.711912142 +0000 UTC"
Apr 16 22:13:47.690966 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.690957 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14118h37m0.020958219s"
Apr 16 22:13:47.745239 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.745208 2560 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 22:13:47.757306 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757279 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-tuned\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.757306 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757309 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757326 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/29af9e08-7c9a-47dc-a8e2-f5af3aee18db-iptables-alerter-script\") pod \"iptables-alerter-249mr\" (UID: \"29af9e08-7c9a-47dc-a8e2-f5af3aee18db\") " pod="openshift-network-operator/iptables-alerter-249mr"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757341 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-hostroot\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757361 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-etc-selinux\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757384 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-node-log\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757406 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fc7d6449-50b3-4246-90df-a37a0edc66d9-konnectivity-ca\") pod \"konnectivity-agent-4frh7\" (UID: \"fc7d6449-50b3-4246-90df-a37a0edc66d9\") " pod="kube-system/konnectivity-agent-4frh7"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757432 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-host\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757457 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e68012e7-cccf-4da0-86e1-1355b80e2784-cni-binary-copy\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757454 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-hostroot\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757482 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757486 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757506 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-ovnkube-config\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757530 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/40fbccdd-dab4-458a-93d4-a60daf555bc6-tmp\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.757544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757554 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-system-cni-dir\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757583 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-os-release\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757592 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-etc-selinux\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757608 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-systemd-units\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757631 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-run-systemd\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757657 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-var-lib-kubelet\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757684 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-etc-kubernetes\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757695 2560 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757708 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/81cb0b16-9454-40e0-907f-db4de6741a0c-tmp-dir\") pod \"node-resolver-p58zd\" (UID: \"81cb0b16-9454-40e0-907f-db4de6741a0c\") " pod="openshift-dns/node-resolver-p58zd"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757735 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-sysctl-d\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757775 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqsrd\" (UniqueName: \"kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd\") pod \"network-check-target-qfmn8\" (UID: \"fac69899-e7fe-4a77-b0b3-504c9f451bdf\") " pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757796 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-slash\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757817 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mb7n\" (UniqueName: \"kubernetes.io/projected/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-kube-api-access-6mb7n\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757837 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-cni-dir\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757862 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-cni-binary-copy\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757903 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-var-lib-cni-multus\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757933 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757958 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-registration-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx"
Apr 16 22:13:47.758222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757960 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/29af9e08-7c9a-47dc-a8e2-f5af3aee18db-iptables-alerter-script\") pod \"iptables-alerter-249mr\" (UID: \"29af9e08-7c9a-47dc-a8e2-f5af3aee18db\") " pod="openshift-network-operator/iptables-alerter-249mr"
Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757978 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-node-log\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.757988 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-device-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx"
Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758024 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fc7d6449-50b3-4246-90df-a37a0edc66d9-konnectivity-ca\") pod \"konnectivity-agent-4frh7\" (UID: \"fc7d6449-50b3-4246-90df-a37a0edc66d9\") " pod="kube-system/konnectivity-agent-4frh7"
Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758041 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-device-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx"
Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758065 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfgd5\" (UniqueName: \"kubernetes.io/projected/81cb0b16-9454-40e0-907f-db4de6741a0c-kube-api-access-hfgd5\") pod \"node-resolver-p58zd\" (UID: \"81cb0b16-9454-40e0-907f-db4de6741a0c\") " pod="openshift-dns/node-resolver-p58zd"
Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758097 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-ovnkube-config\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758100 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-slash\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758118 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-sysctl-d\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh"
Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758101 2560
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/11442be0-1bee-4a8d-8374-44ea164b6268-serviceca\") pod \"node-ca-bn8st\" (UID: \"11442be0-1bee-4a8d-8374-44ea164b6268\") " pod="openshift-image-registry/node-ca-bn8st" Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758156 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-sysconfig\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758180 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-sysctl-conf\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758207 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-run\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758231 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-run-netns\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758255 2560 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-ovn-node-metrics-cert\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758279 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-os-release\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758303 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11442be0-1bee-4a8d-8374-44ea164b6268-host\") pod \"node-ca-bn8st\" (UID: \"11442be0-1bee-4a8d-8374-44ea164b6268\") " pod="openshift-image-registry/node-ca-bn8st" Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758326 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9f6rn\" (UniqueName: \"kubernetes.io/projected/f45f69f8-87a8-49f9-bb8b-485368427802-kube-api-access-9f6rn\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:47.759016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758354 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z572t\" (UniqueName: \"kubernetes.io/projected/40fbccdd-dab4-458a-93d4-a60daf555bc6-kube-api-access-z572t\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758360 2560 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-cni-dir\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758420 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758495 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-sysconfig\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758533 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e68012e7-cccf-4da0-86e1-1355b80e2784-cni-binary-copy\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758557 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-host\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 
22:13:47.758692 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-sysctl-conf\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758762 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-run-netns\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758770 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-os-release\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758787 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-run\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758796 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11442be0-1bee-4a8d-8374-44ea164b6268-host\") pod \"node-ca-bn8st\" (UID: \"11442be0-1bee-4a8d-8374-44ea164b6268\") " pod="openshift-image-registry/node-ca-bn8st" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758811 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-var-lib-cni-multus\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758849 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-run-systemd\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758852 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-system-cni-dir\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758855 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758859 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-var-lib-kubelet\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758899 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-etc-kubernetes\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758919 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-systemd-units\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.759939 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758943 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-os-release\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758970 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-registration-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.758991 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e68012e7-cccf-4da0-86e1-1355b80e2784-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759024 
2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-kubelet\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759048 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-var-lib-openvswitch\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759073 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-system-cni-dir\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759112 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-socket-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759123 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-system-cni-dir\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759158 2560 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-cnibin\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759172 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-var-lib-openvswitch\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759185 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-cni-bin\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759210 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-env-overrides\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759214 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-kubelet\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759239 
2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-run-multus-certs\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759272 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhz2\" (UniqueName: \"kubernetes.io/projected/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-kube-api-access-vjhz2\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759296 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fc7d6449-50b3-4246-90df-a37a0edc66d9-agent-certs\") pod \"konnectivity-agent-4frh7\" (UID: \"fc7d6449-50b3-4246-90df-a37a0edc66d9\") " pod="kube-system/konnectivity-agent-4frh7" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759300 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/11442be0-1bee-4a8d-8374-44ea164b6268-serviceca\") pod \"node-ca-bn8st\" (UID: \"11442be0-1bee-4a8d-8374-44ea164b6268\") " pod="openshift-image-registry/node-ca-bn8st" Apr 16 22:13:47.760729 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759315 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-socket-dir\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759374 2560 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-cni-bin\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759318 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:47.759403 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759418 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29af9e08-7c9a-47dc-a8e2-f5af3aee18db-host-slash\") pod \"iptables-alerter-249mr\" (UID: \"29af9e08-7c9a-47dc-a8e2-f5af3aee18db\") " pod="openshift-network-operator/iptables-alerter-249mr" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759442 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-run-ovn-kubernetes\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:47.759461 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs 
podName:f45f69f8-87a8-49f9-bb8b-485368427802 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:48.259441025 +0000 UTC m=+3.067694168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs") pod "network-metrics-daemon-cph62" (UID: "f45f69f8-87a8-49f9-bb8b-485368427802") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759478 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-run-ovn-kubernetes\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759489 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-cni-netd\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759515 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29af9e08-7c9a-47dc-a8e2-f5af3aee18db-host-slash\") pod \"iptables-alerter-249mr\" (UID: \"29af9e08-7c9a-47dc-a8e2-f5af3aee18db\") " pod="openshift-network-operator/iptables-alerter-249mr" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759517 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-run-k8s-cni-cncf-io\") pod \"multus-4gzw4\" 
(UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759527 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e68012e7-cccf-4da0-86e1-1355b80e2784-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759547 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-conf-dir\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759551 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-run-k8s-cni-cncf-io\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759570 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-sys\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759597 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-var-lib-cni-bin\") pod 
\"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759621 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-sys-fs\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.761544 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759644 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/81cb0b16-9454-40e0-907f-db4de6741a0c-hosts-file\") pod \"node-resolver-p58zd\" (UID: \"81cb0b16-9454-40e0-907f-db4de6741a0c\") " pod="openshift-dns/node-resolver-p58zd" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759669 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-lib-modules\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759691 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-run-multus-certs\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759704 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92jp7\" (UniqueName: \"kubernetes.io/projected/e68012e7-cccf-4da0-86e1-1355b80e2784-kube-api-access-92jp7\") pod 
\"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759733 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-cnibin\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759757 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-socket-dir-parent\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759782 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-daemon-config\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759805 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc967\" (UniqueName: \"kubernetes.io/projected/11442be0-1bee-4a8d-8374-44ea164b6268-kube-api-access-cc967\") pod \"node-ca-bn8st\" (UID: \"11442be0-1bee-4a8d-8374-44ea164b6268\") " pod="openshift-image-registry/node-ca-bn8st" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759818 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/81cb0b16-9454-40e0-907f-db4de6741a0c-hosts-file\") pod 
\"node-resolver-p58zd\" (UID: \"81cb0b16-9454-40e0-907f-db4de6741a0c\") " pod="openshift-dns/node-resolver-p58zd" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759827 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-systemd\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759849 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e68012e7-cccf-4da0-86e1-1355b80e2784-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759864 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-conf-dir\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759892 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-ovnkube-script-lib\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759927 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-kubernetes\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759945 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/81cb0b16-9454-40e0-907f-db4de6741a0c-tmp-dir\") pod \"node-resolver-p58zd\" (UID: \"81cb0b16-9454-40e0-907f-db4de6741a0c\") " pod="openshift-dns/node-resolver-p58zd" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759952 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-var-lib-kubelet\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.759991 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-var-lib-kubelet\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.762119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760024 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sx7tk\" (UniqueName: \"kubernetes.io/projected/29af9e08-7c9a-47dc-a8e2-f5af3aee18db-kube-api-access-sx7tk\") pod \"iptables-alerter-249mr\" (UID: \"29af9e08-7c9a-47dc-a8e2-f5af3aee18db\") " pod="openshift-network-operator/iptables-alerter-249mr" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760031 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-var-lib-cni-bin\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760076 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-run-ovn\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760088 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-env-overrides\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760103 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-log-socket\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760112 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-kubernetes\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760077 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/5fbb596d-4520-40e0-a72b-a223546a6d8f-sys-fs\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760153 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-run-ovn\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760168 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-log-socket\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760186 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4cf5\" (UniqueName: \"kubernetes.io/projected/5fbb596d-4520-40e0-a72b-a223546a6d8f-kube-api-access-b4cf5\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760208 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-host-cni-netd\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760213 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-run-openvswitch\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760239 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-etc-openvswitch\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760264 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-run-netns\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760290 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-modprobe-d\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760312 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-lib-modules\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760342 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-sys\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760371 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-run-openvswitch\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.762819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760381 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-cnibin\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760389 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-etc-openvswitch\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760409 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-host-run-netns\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760436 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-socket-dir-parent\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760456 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-modprobe-d\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760630 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-systemd\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760905 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-ovnkube-script-lib\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760936 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-multus-daemon-config\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.760993 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/e68012e7-cccf-4da0-86e1-1355b80e2784-cnibin\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.761176 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e68012e7-cccf-4da0-86e1-1355b80e2784-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.761214 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-cni-binary-copy\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.761239 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/40fbccdd-dab4-458a-93d4-a60daf555bc6-tmp\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.761327 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/40fbccdd-dab4-458a-93d4-a60daf555bc6-etc-tuned\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.761759 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-ovn-node-metrics-cert\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.763344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.762626 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fc7d6449-50b3-4246-90df-a37a0edc66d9-agent-certs\") pod \"konnectivity-agent-4frh7\" (UID: \"fc7d6449-50b3-4246-90df-a37a0edc66d9\") " pod="kube-system/konnectivity-agent-4frh7" Apr 16 22:13:47.769280 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.769260 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfgd5\" (UniqueName: \"kubernetes.io/projected/81cb0b16-9454-40e0-907f-db4de6741a0c-kube-api-access-hfgd5\") pod \"node-resolver-p58zd\" (UID: \"81cb0b16-9454-40e0-907f-db4de6741a0c\") " pod="openshift-dns/node-resolver-p58zd" Apr 16 22:13:47.771471 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:47.771449 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:47.771576 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:47.771474 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:47.771576 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:47.771488 2560 projected.go:194] Error preparing data for projected volume kube-api-access-gqsrd for pod openshift-network-diagnostics/network-check-target-qfmn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:47.771576 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:47.771563 2560 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd podName:fac69899-e7fe-4a77-b0b3-504c9f451bdf nodeName:}" failed. No retries permitted until 2026-04-16 22:13:48.271543977 +0000 UTC m=+3.079797130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gqsrd" (UniqueName: "kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd") pod "network-check-target-qfmn8" (UID: "fac69899-e7fe-4a77-b0b3-504c9f451bdf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:47.772640 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.772619 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mb7n\" (UniqueName: \"kubernetes.io/projected/703a2cee-8a1d-4b57-b34b-e3d59b2bc18a-kube-api-access-6mb7n\") pod \"ovnkube-node-5ggql\" (UID: \"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:47.773336 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.773317 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f6rn\" (UniqueName: \"kubernetes.io/projected/f45f69f8-87a8-49f9-bb8b-485368427802-kube-api-access-9f6rn\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:47.773522 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.773502 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z572t\" (UniqueName: \"kubernetes.io/projected/40fbccdd-dab4-458a-93d4-a60daf555bc6-kube-api-access-z572t\") pod \"tuned-7zwqh\" (UID: \"40fbccdd-dab4-458a-93d4-a60daf555bc6\") " pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.780252 ip-10-0-142-35 
kubenswrapper[2560]: I0416 22:13:47.780223 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4cf5\" (UniqueName: \"kubernetes.io/projected/5fbb596d-4520-40e0-a72b-a223546a6d8f-kube-api-access-b4cf5\") pod \"aws-ebs-csi-driver-node-zl8hx\" (UID: \"5fbb596d-4520-40e0-a72b-a223546a6d8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.783241 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.783205 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx7tk\" (UniqueName: \"kubernetes.io/projected/29af9e08-7c9a-47dc-a8e2-f5af3aee18db-kube-api-access-sx7tk\") pod \"iptables-alerter-249mr\" (UID: \"29af9e08-7c9a-47dc-a8e2-f5af3aee18db\") " pod="openshift-network-operator/iptables-alerter-249mr" Apr 16 22:13:47.783566 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.783537 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jp7\" (UniqueName: \"kubernetes.io/projected/e68012e7-cccf-4da0-86e1-1355b80e2784-kube-api-access-92jp7\") pod \"multus-additional-cni-plugins-9sv6h\" (UID: \"e68012e7-cccf-4da0-86e1-1355b80e2784\") " pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.784845 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.784816 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc967\" (UniqueName: \"kubernetes.io/projected/11442be0-1bee-4a8d-8374-44ea164b6268-kube-api-access-cc967\") pod \"node-ca-bn8st\" (UID: \"11442be0-1bee-4a8d-8374-44ea164b6268\") " pod="openshift-image-registry/node-ca-bn8st" Apr 16 22:13:47.785051 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.785036 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjhz2\" (UniqueName: \"kubernetes.io/projected/2b0c884f-a52c-4cb2-8c9e-b1036ea24b12-kube-api-access-vjhz2\") pod \"multus-4gzw4\" (UID: \"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12\") " 
pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.941500 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.941409 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4gzw4" Apr 16 22:13:47.951320 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.951294 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" Apr 16 22:13:47.956473 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.956455 2560 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:47.959647 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.959626 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p58zd" Apr 16 22:13:47.964924 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.964905 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bn8st" Apr 16 22:13:47.970413 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.970396 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4frh7" Apr 16 22:13:47.976954 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.976934 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" Apr 16 22:13:47.984498 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.984471 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" Apr 16 22:13:47.992018 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.991995 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-249mr" Apr 16 22:13:47.997597 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:47.997580 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" Apr 16 22:13:48.103296 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.103264 2560 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:48.263685 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.263603 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:48.263850 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:48.263743 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:48.263850 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:48.263811 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs podName:f45f69f8-87a8-49f9-bb8b-485368427802 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:49.263793244 +0000 UTC m=+4.072046391 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs") pod "network-metrics-daemon-cph62" (UID: "f45f69f8-87a8-49f9-bb8b-485368427802") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:48.317155 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:48.317125 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11442be0_1bee_4a8d_8374_44ea164b6268.slice/crio-2b6df5888d4bd9a03b768757dd874a627d71ebb82b0370a4b92d0a236cc1cc87 WatchSource:0}: Error finding container 2b6df5888d4bd9a03b768757dd874a627d71ebb82b0370a4b92d0a236cc1cc87: Status 404 returned error can't find the container with id 2b6df5888d4bd9a03b768757dd874a627d71ebb82b0370a4b92d0a236cc1cc87 Apr 16 22:13:48.319311 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:48.319274 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81cb0b16_9454_40e0_907f_db4de6741a0c.slice/crio-2fa117524b1b99b881d15318f747679459503f2ed695cfa1d8897d1c0c1a3e50 WatchSource:0}: Error finding container 2fa117524b1b99b881d15318f747679459503f2ed695cfa1d8897d1c0c1a3e50: Status 404 returned error can't find the container with id 2fa117524b1b99b881d15318f747679459503f2ed695cfa1d8897d1c0c1a3e50 Apr 16 22:13:48.322107 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:48.322050 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29af9e08_7c9a_47dc_a8e2_f5af3aee18db.slice/crio-18f95fcf3b5aec87a5a0e5be02b4dba11ca28fcee22f30610f70f5885f2f44a2 WatchSource:0}: Error finding container 18f95fcf3b5aec87a5a0e5be02b4dba11ca28fcee22f30610f70f5885f2f44a2: Status 404 returned error can't find the container with id 18f95fcf3b5aec87a5a0e5be02b4dba11ca28fcee22f30610f70f5885f2f44a2 Apr 16 22:13:48.322820 
ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:48.322756 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b0c884f_a52c_4cb2_8c9e_b1036ea24b12.slice/crio-997d04073b946080649e4cec6327455411ae70e96b28cd0451d5659341e89b4f WatchSource:0}: Error finding container 997d04073b946080649e4cec6327455411ae70e96b28cd0451d5659341e89b4f: Status 404 returned error can't find the container with id 997d04073b946080649e4cec6327455411ae70e96b28cd0451d5659341e89b4f Apr 16 22:13:48.323919 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:48.323808 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode68012e7_cccf_4da0_86e1_1355b80e2784.slice/crio-703dc823b39b05988c2d4b5516fbf0f307e4a16b417c08c311c357c9ec5c944c WatchSource:0}: Error finding container 703dc823b39b05988c2d4b5516fbf0f307e4a16b417c08c311c357c9ec5c944c: Status 404 returned error can't find the container with id 703dc823b39b05988c2d4b5516fbf0f307e4a16b417c08c311c357c9ec5c944c Apr 16 22:13:48.325035 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:48.324531 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc7d6449_50b3_4246_90df_a37a0edc66d9.slice/crio-67c683c36d5b423d0f9b23d6413255a20be8fdc7aea7afc4a3d1486c4b3c6158 WatchSource:0}: Error finding container 67c683c36d5b423d0f9b23d6413255a20be8fdc7aea7afc4a3d1486c4b3c6158: Status 404 returned error can't find the container with id 67c683c36d5b423d0f9b23d6413255a20be8fdc7aea7afc4a3d1486c4b3c6158 Apr 16 22:13:48.325676 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:13:48.325585 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40fbccdd_dab4_458a_93d4_a60daf555bc6.slice/crio-3e003abee821930d9034a4fc20ea4406494a1d3b57f1378f2a92df42dae77bbe WatchSource:0}: Error 
finding container 3e003abee821930d9034a4fc20ea4406494a1d3b57f1378f2a92df42dae77bbe: Status 404 returned error can't find the container with id 3e003abee821930d9034a4fc20ea4406494a1d3b57f1378f2a92df42dae77bbe Apr 16 22:13:48.364476 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.364292 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqsrd\" (UniqueName: \"kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd\") pod \"network-check-target-qfmn8\" (UID: \"fac69899-e7fe-4a77-b0b3-504c9f451bdf\") " pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:48.364576 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:48.364444 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:48.364576 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:48.364575 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:48.364669 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:48.364589 2560 projected.go:194] Error preparing data for projected volume kube-api-access-gqsrd for pod openshift-network-diagnostics/network-check-target-qfmn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:48.364669 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:48.364634 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd podName:fac69899-e7fe-4a77-b0b3-504c9f451bdf nodeName:}" failed. No retries permitted until 2026-04-16 22:13:49.364619729 +0000 UTC m=+4.172872860 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gqsrd" (UniqueName: "kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd") pod "network-check-target-qfmn8" (UID: "fac69899-e7fe-4a77-b0b3-504c9f451bdf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:48.691417 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.691369 2560 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:46 +0000 UTC" deadline="2027-12-09 10:44:59.842001731 +0000 UTC" Apr 16 22:13:48.691417 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.691413 2560 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14436h31m11.150592713s" Apr 16 22:13:48.774380 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.773863 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:48.774380 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:48.774010 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:13:48.783356 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.783318 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" event={"ID":"40fbccdd-dab4-458a-93d4-a60daf555bc6","Type":"ContainerStarted","Data":"3e003abee821930d9034a4fc20ea4406494a1d3b57f1378f2a92df42dae77bbe"} Apr 16 22:13:48.789999 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.789924 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" event={"ID":"e68012e7-cccf-4da0-86e1-1355b80e2784","Type":"ContainerStarted","Data":"703dc823b39b05988c2d4b5516fbf0f307e4a16b417c08c311c357c9ec5c944c"} Apr 16 22:13:48.797342 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.797272 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p58zd" event={"ID":"81cb0b16-9454-40e0-907f-db4de6741a0c","Type":"ContainerStarted","Data":"2fa117524b1b99b881d15318f747679459503f2ed695cfa1d8897d1c0c1a3e50"} Apr 16 22:13:48.800808 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.799985 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal" event={"ID":"e12fe8cf6bc72ee22e06a7c9a61455c5","Type":"ContainerStarted","Data":"0e9c81b99848a8ebb7876ae8aade9d347fb0956953165bdd95d659ce3ffc5535"} Apr 16 22:13:48.808818 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.808790 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" event={"ID":"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a","Type":"ContainerStarted","Data":"3a958752642b8b63ab989e32c7f0b7d2dbe7be5804795b4cb9cd5277a5d453c8"} Apr 16 22:13:48.817570 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.817353 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4frh7" 
event={"ID":"fc7d6449-50b3-4246-90df-a37a0edc66d9","Type":"ContainerStarted","Data":"67c683c36d5b423d0f9b23d6413255a20be8fdc7aea7afc4a3d1486c4b3c6158"} Apr 16 22:13:48.820297 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.820266 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-249mr" event={"ID":"29af9e08-7c9a-47dc-a8e2-f5af3aee18db","Type":"ContainerStarted","Data":"18f95fcf3b5aec87a5a0e5be02b4dba11ca28fcee22f30610f70f5885f2f44a2"} Apr 16 22:13:48.825214 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.825187 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bn8st" event={"ID":"11442be0-1bee-4a8d-8374-44ea164b6268","Type":"ContainerStarted","Data":"2b6df5888d4bd9a03b768757dd874a627d71ebb82b0370a4b92d0a236cc1cc87"} Apr 16 22:13:48.844930 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.838351 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4gzw4" event={"ID":"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12","Type":"ContainerStarted","Data":"997d04073b946080649e4cec6327455411ae70e96b28cd0451d5659341e89b4f"} Apr 16 22:13:48.844930 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:48.843589 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" event={"ID":"5fbb596d-4520-40e0-a72b-a223546a6d8f","Type":"ContainerStarted","Data":"005a03cda283e031c98471339fcf7f83e1b46d43817a1a6d130afe5030d105f7"} Apr 16 22:13:49.274129 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:49.274097 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:49.274289 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:49.274264 
2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:49.274354 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:49.274336 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs podName:f45f69f8-87a8-49f9-bb8b-485368427802 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:51.274314213 +0000 UTC m=+6.082567356 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs") pod "network-metrics-daemon-cph62" (UID: "f45f69f8-87a8-49f9-bb8b-485368427802") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:49.375685 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:49.375089 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqsrd\" (UniqueName: \"kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd\") pod \"network-check-target-qfmn8\" (UID: \"fac69899-e7fe-4a77-b0b3-504c9f451bdf\") " pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:49.375685 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:49.375247 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:49.375685 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:49.375266 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:49.375685 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:49.375278 2560 projected.go:194] Error preparing data for projected volume kube-api-access-gqsrd for pod 
openshift-network-diagnostics/network-check-target-qfmn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:49.375685 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:49.375339 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd podName:fac69899-e7fe-4a77-b0b3-504c9f451bdf nodeName:}" failed. No retries permitted until 2026-04-16 22:13:51.375320849 +0000 UTC m=+6.183573986 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqsrd" (UniqueName: "kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd") pod "network-check-target-qfmn8" (UID: "fac69899-e7fe-4a77-b0b3-504c9f451bdf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:49.654460 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:49.654105 2560 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:49.775315 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:49.774795 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:49.775315 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:49.774946 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:13:49.854595 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:49.853496 2560 generic.go:358] "Generic (PLEG): container finished" podID="e7ebb63d89fc306c77bf2d82a3bc33b8" containerID="b8ae12fd1ed0ef296cab304b29fee4afc950e757c5c6ba9a06b40bcd9b5c3728" exitCode=0 Apr 16 22:13:49.854595 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:49.854526 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal" event={"ID":"e7ebb63d89fc306c77bf2d82a3bc33b8","Type":"ContainerDied","Data":"b8ae12fd1ed0ef296cab304b29fee4afc950e757c5c6ba9a06b40bcd9b5c3728"} Apr 16 22:13:49.868924 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:49.868433 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-35.ec2.internal" podStartSLOduration=2.8684138470000002 podStartE2EDuration="2.868413847s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:48.814895951 +0000 UTC m=+3.623149100" watchObservedRunningTime="2026-04-16 22:13:49.868413847 +0000 UTC m=+4.676667003" Apr 16 22:13:50.237676 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.236824 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9lc9p"] Apr 16 22:13:50.239518 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.238930 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:50.239518 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:50.239028 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:13:50.283381 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.283332 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/816c1425-a29e-4e5e-b5ae-ad9b214b5349-dbus\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:50.283634 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.283392 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/816c1425-a29e-4e5e-b5ae-ad9b214b5349-kubelet-config\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:50.283634 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.283456 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:50.384095 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.384004 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/816c1425-a29e-4e5e-b5ae-ad9b214b5349-dbus\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:50.384095 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.384064 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/816c1425-a29e-4e5e-b5ae-ad9b214b5349-kubelet-config\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:50.384305 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.384132 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:50.384305 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:50.384284 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:50.384410 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:50.384350 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret podName:816c1425-a29e-4e5e-b5ae-ad9b214b5349 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:50.884328642 +0000 UTC m=+5.692581774 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret") pod "global-pull-secret-syncer-9lc9p" (UID: "816c1425-a29e-4e5e-b5ae-ad9b214b5349") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:50.384717 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.384694 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/816c1425-a29e-4e5e-b5ae-ad9b214b5349-dbus\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:50.384816 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.384766 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/816c1425-a29e-4e5e-b5ae-ad9b214b5349-kubelet-config\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:50.772579 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.772496 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:50.772769 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:50.772656 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:13:50.866664 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.866627 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal" event={"ID":"e7ebb63d89fc306c77bf2d82a3bc33b8","Type":"ContainerStarted","Data":"4300f0847f9d399274cea37bedccdeda92d10c004ced6373267980b791e92557"} Apr 16 22:13:50.888830 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:50.888791 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:50.889017 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:50.888991 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:50.889077 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:50.889049 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret podName:816c1425-a29e-4e5e-b5ae-ad9b214b5349 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:51.889030985 +0000 UTC m=+6.697284122 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret") pod "global-pull-secret-syncer-9lc9p" (UID: "816c1425-a29e-4e5e-b5ae-ad9b214b5349") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:51.302604 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:51.302522 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:51.302775 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:51.302686 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:51.302775 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:51.302761 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs podName:f45f69f8-87a8-49f9-bb8b-485368427802 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:55.302741928 +0000 UTC m=+10.110995074 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs") pod "network-metrics-daemon-cph62" (UID: "f45f69f8-87a8-49f9-bb8b-485368427802") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:51.403530 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:51.403486 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqsrd\" (UniqueName: \"kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd\") pod \"network-check-target-qfmn8\" (UID: \"fac69899-e7fe-4a77-b0b3-504c9f451bdf\") " pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:51.403691 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:51.403671 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:51.403768 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:51.403693 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:51.403768 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:51.403708 2560 projected.go:194] Error preparing data for projected volume kube-api-access-gqsrd for pod openshift-network-diagnostics/network-check-target-qfmn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:51.403885 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:51.403773 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd podName:fac69899-e7fe-4a77-b0b3-504c9f451bdf nodeName:}" failed. 
No retries permitted until 2026-04-16 22:13:55.403753888 +0000 UTC m=+10.212007022 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqsrd" (UniqueName: "kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd") pod "network-check-target-qfmn8" (UID: "fac69899-e7fe-4a77-b0b3-504c9f451bdf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:51.773340 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:51.773308 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:51.773518 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:51.773434 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:13:51.773891 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:51.773830 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:51.773984 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:51.773943 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:13:51.907490 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:51.907329 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:51.907974 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:51.907506 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:51.907974 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:51.907568 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret podName:816c1425-a29e-4e5e-b5ae-ad9b214b5349 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:53.907550462 +0000 UTC m=+8.715803595 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret") pod "global-pull-secret-syncer-9lc9p" (UID: "816c1425-a29e-4e5e-b5ae-ad9b214b5349") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:52.772627 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:52.772591 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:52.772825 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:52.772741 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:13:53.773500 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:53.773412 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:53.773973 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:53.773530 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:13:53.773973 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:53.773584 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:53.773973 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:53.773659 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:13:53.925474 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:53.925433 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:53.925645 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:53.925625 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:53.925716 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:53.925705 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret podName:816c1425-a29e-4e5e-b5ae-ad9b214b5349 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:57.925684398 +0000 UTC m=+12.733937531 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret") pod "global-pull-secret-syncer-9lc9p" (UID: "816c1425-a29e-4e5e-b5ae-ad9b214b5349") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:54.773379 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:54.773263 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:54.773542 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:54.773386 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:13:55.338414 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:55.338205 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:55.338414 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:55.338408 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:55.338657 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:55.338471 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs podName:f45f69f8-87a8-49f9-bb8b-485368427802 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:03.338454457 +0000 UTC m=+18.146707607 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs") pod "network-metrics-daemon-cph62" (UID: "f45f69f8-87a8-49f9-bb8b-485368427802") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:55.438772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:55.438737 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqsrd\" (UniqueName: \"kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd\") pod \"network-check-target-qfmn8\" (UID: \"fac69899-e7fe-4a77-b0b3-504c9f451bdf\") " pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:55.438977 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:55.438960 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:55.439064 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:55.438985 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:55.439064 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:55.439000 2560 projected.go:194] Error preparing data for projected volume kube-api-access-gqsrd for pod openshift-network-diagnostics/network-check-target-qfmn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:55.439064 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:55.439063 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd podName:fac69899-e7fe-4a77-b0b3-504c9f451bdf nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:03.439044525 +0000 UTC m=+18.247297675 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqsrd" (UniqueName: "kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd") pod "network-check-target-qfmn8" (UID: "fac69899-e7fe-4a77-b0b3-504c9f451bdf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:55.774060 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:55.773977 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:55.774480 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:55.774091 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:13:55.774542 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:55.774501 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:55.774629 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:55.774590 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:13:56.772614 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:56.772573 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:56.772816 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:56.772739 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:13:57.772959 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:57.772928 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:57.773432 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:57.772940 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:57.773432 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:57.773047 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:13:57.773432 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:57.773128 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:13:57.959090 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:57.959055 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:57.959263 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:57.959234 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:57.959321 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:57.959308 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret podName:816c1425-a29e-4e5e-b5ae-ad9b214b5349 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:05.959284761 +0000 UTC m=+20.767537898 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret") pod "global-pull-secret-syncer-9lc9p" (UID: "816c1425-a29e-4e5e-b5ae-ad9b214b5349") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:58.772714 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:58.772666 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:13:58.772908 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:58.772826 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:13:59.772801 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:59.772762 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:13:59.773318 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:13:59.772810 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:13:59.773318 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:59.772896 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:13:59.773318 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:13:59.773019 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:14:00.773153 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:00.773111 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:14:00.773617 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:00.773256 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:14:01.772629 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:01.772590 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:14:01.772838 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:01.772646 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:14:01.772838 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:01.772793 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:14:01.772942 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:01.772897 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:14:02.773165 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:02.773126 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:14:02.773616 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:02.773254 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:14:03.396660 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:03.396624 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:14:03.396847 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:03.396811 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:14:03.396925 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:03.396899 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs podName:f45f69f8-87a8-49f9-bb8b-485368427802 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:19.396862889 +0000 UTC m=+34.205116037 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs") pod "network-metrics-daemon-cph62" (UID: "f45f69f8-87a8-49f9-bb8b-485368427802") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:14:03.497586 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:03.497542 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqsrd\" (UniqueName: \"kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd\") pod \"network-check-target-qfmn8\" (UID: \"fac69899-e7fe-4a77-b0b3-504c9f451bdf\") " pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:14:03.497763 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:03.497731 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:14:03.497763 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:03.497756 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:14:03.497913 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:03.497770 2560 projected.go:194] Error preparing data for projected volume kube-api-access-gqsrd for pod openshift-network-diagnostics/network-check-target-qfmn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:03.497913 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:03.497889 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd podName:fac69899-e7fe-4a77-b0b3-504c9f451bdf nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:19.497854381 +0000 UTC m=+34.306107533 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqsrd" (UniqueName: "kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd") pod "network-check-target-qfmn8" (UID: "fac69899-e7fe-4a77-b0b3-504c9f451bdf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:03.775347 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:03.775271 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:14:03.775799 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:03.775273 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:14:03.775799 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:03.775379 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:14:03.775799 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:03.775467 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:14:04.773327 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:04.773257 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:14:04.773468 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:04.773370 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:14:05.773987 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.773667 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:14:05.774636 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.773737 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:14:05.774636 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:05.774076 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:14:05.774636 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:05.774169 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:14:05.893650 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.893621 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" event={"ID":"40fbccdd-dab4-458a-93d4-a60daf555bc6","Type":"ContainerStarted","Data":"4ff93dd7b095d48a3200f9aa12acc8e447eed37b0b2eb504f8ae30cd6fc9b893"} Apr 16 22:14:05.894999 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.894974 2560 generic.go:358] "Generic (PLEG): container finished" podID="e68012e7-cccf-4da0-86e1-1355b80e2784" containerID="609a871862fa262d315da01ea832dab0b2995b6b1d32995b394a95e7a81b0aca" exitCode=0 Apr 16 22:14:05.895107 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.895029 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" event={"ID":"e68012e7-cccf-4da0-86e1-1355b80e2784","Type":"ContainerDied","Data":"609a871862fa262d315da01ea832dab0b2995b6b1d32995b394a95e7a81b0aca"} Apr 16 22:14:05.896318 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.896211 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p58zd" event={"ID":"81cb0b16-9454-40e0-907f-db4de6741a0c","Type":"ContainerStarted","Data":"55d0914a2bdf8a8dd43cf193c676f959ae3a81abf1a10ad3348b48b3830ff50b"} Apr 16 22:14:05.898672 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.898653 2560 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log" Apr 16 22:14:05.898962 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.898945 2560 generic.go:358] "Generic (PLEG): container finished" podID="703a2cee-8a1d-4b57-b34b-e3d59b2bc18a" containerID="24fb41c82b7fa795142a440899bd9ac21b067b571058d46d04290d4d17a78dd9" exitCode=1 Apr 16 22:14:05.899031 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.899003 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" event={"ID":"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a","Type":"ContainerStarted","Data":"f1ed1c4b3a7b189e83e86cdf7ea73a8e7c57ea7a4e04ccb3b25828e98b94a80e"} Apr 16 22:14:05.899031 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.899025 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" event={"ID":"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a","Type":"ContainerStarted","Data":"569231fa7b1a54a17a74a41cd54aefcfe3e32260dff69274fc92971443a8a00d"} Apr 16 22:14:05.899110 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.899035 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" event={"ID":"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a","Type":"ContainerStarted","Data":"e47d6113ed0a1fb9bec7c642b91fb577f0d919155159adc9c611f46571fcc036"} Apr 16 22:14:05.899110 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.899043 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" event={"ID":"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a","Type":"ContainerStarted","Data":"d2bf66a34966a153ce965a31b1524fefb1c059483948b5c6b00f4bdc891cac00"} Apr 16 22:14:05.899110 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.899051 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" 
event={"ID":"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a","Type":"ContainerDied","Data":"24fb41c82b7fa795142a440899bd9ac21b067b571058d46d04290d4d17a78dd9"} Apr 16 22:14:05.899110 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.899061 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" event={"ID":"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a","Type":"ContainerStarted","Data":"a38a5f3985c65800f0db828173923676fe516f26654d2073fba98a16b60aad0b"} Apr 16 22:14:05.900145 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.900126 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4frh7" event={"ID":"fc7d6449-50b3-4246-90df-a37a0edc66d9","Type":"ContainerStarted","Data":"def4dbb5c4545e3517815500a7955a059a60735fd01f41738dc6a4d31ffacf00"} Apr 16 22:14:05.901633 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.901615 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bn8st" event={"ID":"11442be0-1bee-4a8d-8374-44ea164b6268","Type":"ContainerStarted","Data":"fdfeb8e2c835b31e8feb35c8360223f68306147fed257190aa9b74088292df11"} Apr 16 22:14:05.905320 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.905297 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4gzw4" event={"ID":"2b0c884f-a52c-4cb2-8c9e-b1036ea24b12","Type":"ContainerStarted","Data":"97907316c212994d3ffa73fbd0bb14bbb016d22c45419fd2f34f5232bd2751e3"} Apr 16 22:14:05.906357 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.906339 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" event={"ID":"5fbb596d-4520-40e0-a72b-a223546a6d8f","Type":"ContainerStarted","Data":"38a322b93a86f271e83f9606d468a2d053897dbc45d2ee06b44d2201ee2371f6"} Apr 16 22:14:05.912011 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.911967 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-35.ec2.internal" podStartSLOduration=18.911952162 podStartE2EDuration="18.911952162s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:50.882281263 +0000 UTC m=+5.690534416" watchObservedRunningTime="2026-04-16 22:14:05.911952162 +0000 UTC m=+20.720205316" Apr 16 22:14:05.912330 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.912298 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7zwqh" podStartSLOduration=4.21594639 podStartE2EDuration="20.912290475s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:13:48.327592819 +0000 UTC m=+3.135845967" lastFinishedPulling="2026-04-16 22:14:05.023936916 +0000 UTC m=+19.832190052" observedRunningTime="2026-04-16 22:14:05.911471839 +0000 UTC m=+20.719725016" watchObservedRunningTime="2026-04-16 22:14:05.912290475 +0000 UTC m=+20.720543651" Apr 16 22:14:05.927028 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.926994 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4gzw4" podStartSLOduration=4.182189011 podStartE2EDuration="20.926983051s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:13:48.325086642 +0000 UTC m=+3.133339771" lastFinishedPulling="2026-04-16 22:14:05.069880666 +0000 UTC m=+19.878133811" observedRunningTime="2026-04-16 22:14:05.926927309 +0000 UTC m=+20.735180461" watchObservedRunningTime="2026-04-16 22:14:05.926983051 +0000 UTC m=+20.735236203" Apr 16 22:14:05.969367 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:05.969315 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p58zd" podStartSLOduration=4.249426789 
podStartE2EDuration="20.969300447s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:13:48.321611365 +0000 UTC m=+3.129864508" lastFinishedPulling="2026-04-16 22:14:05.041485027 +0000 UTC m=+19.849738166" observedRunningTime="2026-04-16 22:14:05.96861264 +0000 UTC m=+20.776865998" watchObservedRunningTime="2026-04-16 22:14:05.969300447 +0000 UTC m=+20.777553599" Apr 16 22:14:06.004128 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.004071 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bn8st" podStartSLOduration=4.299083857 podStartE2EDuration="21.004055508s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:13:48.318954129 +0000 UTC m=+3.127207262" lastFinishedPulling="2026-04-16 22:14:05.023925766 +0000 UTC m=+19.832178913" observedRunningTime="2026-04-16 22:14:05.984829604 +0000 UTC m=+20.793082753" watchObservedRunningTime="2026-04-16 22:14:06.004055508 +0000 UTC m=+20.812308665" Apr 16 22:14:06.004740 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.004710 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4frh7" podStartSLOduration=4.307853659 podStartE2EDuration="21.004701837s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:13:48.327076638 +0000 UTC m=+3.135329769" lastFinishedPulling="2026-04-16 22:14:05.023924806 +0000 UTC m=+19.832177947" observedRunningTime="2026-04-16 22:14:06.004007627 +0000 UTC m=+20.812260781" watchObservedRunningTime="2026-04-16 22:14:06.004701837 +0000 UTC m=+20.812954988" Apr 16 22:14:06.016337 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.016309 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret\") pod \"global-pull-secret-syncer-9lc9p\" 
(UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:14:06.017201 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:06.016492 2560 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:14:06.017201 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:06.016566 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret podName:816c1425-a29e-4e5e-b5ae-ad9b214b5349 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:22.016549919 +0000 UTC m=+36.824803068 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret") pod "global-pull-secret-syncer-9lc9p" (UID: "816c1425-a29e-4e5e-b5ae-ad9b214b5349") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:14:06.442367 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.442344 2560 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 22:14:06.709688 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.709532 2560 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:14:06.442361811Z","UUID":"aa810bbb-6a87-42be-a8f1-bb54939986d3","Handler":null,"Name":"","Endpoint":""} Apr 16 22:14:06.711318 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.711294 2560 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 22:14:06.711425 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.711327 2560 csi_plugin.go:119] 
kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 22:14:06.772432 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.772406 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:14:06.772582 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:06.772520 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:14:06.910198 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.910159 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-249mr" event={"ID":"29af9e08-7c9a-47dc-a8e2-f5af3aee18db","Type":"ContainerStarted","Data":"cd64632ece895113a202066e31ef2b9e01f729e2fac2f6c6a4fc4a0aec57aeae"} Apr 16 22:14:06.912268 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.912243 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" event={"ID":"5fbb596d-4520-40e0-a72b-a223546a6d8f","Type":"ContainerStarted","Data":"c7ef8f26524d39b8ee8409f5807ba8b7adb194a79c3c7f6061d132edeb0ff39a"} Apr 16 22:14:06.925348 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:06.925297 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-249mr" podStartSLOduration=5.225087186 podStartE2EDuration="21.925282165s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:13:48.32372872 +0000 UTC m=+3.131981850" lastFinishedPulling="2026-04-16 22:14:05.023923692 +0000 UTC m=+19.832176829" 
observedRunningTime="2026-04-16 22:14:06.925229259 +0000 UTC m=+21.733482412" watchObservedRunningTime="2026-04-16 22:14:06.925282165 +0000 UTC m=+21.733535319" Apr 16 22:14:07.772464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:07.772429 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:14:07.772741 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:07.772556 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf" Apr 16 22:14:07.772741 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:07.772590 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:14:07.772741 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:07.772714 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:14:07.915806 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:07.915766 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" event={"ID":"5fbb596d-4520-40e0-a72b-a223546a6d8f","Type":"ContainerStarted","Data":"0c44f0a57ea566c0691a20e28720fe83138635cd1f6d8da5dc415ed9f20c0c4f"} Apr 16 22:14:08.773436 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:08.773221 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:14:08.773598 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:08.773538 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:14:08.920749 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:08.920719 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log" Apr 16 22:14:08.921229 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:08.921114 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" event={"ID":"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a","Type":"ContainerStarted","Data":"62eb0a201138608d30ceeb95b156673c8f2517738672db94b64f76c15c3f5c64"} Apr 16 22:14:09.773318 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:09.773280 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:14:09.773518 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:09.773437 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349" Apr 16 22:14:09.773518 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:09.773490 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8" Apr 16 22:14:09.773638 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:09.773612 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf"
Apr 16 22:14:10.293636 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.293611 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4frh7"
Apr 16 22:14:10.294260 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.294241 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4frh7"
Apr 16 22:14:10.312007 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.311967 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zl8hx" podStartSLOduration=6.113577968 podStartE2EDuration="25.31195336s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:13:48.329026161 +0000 UTC m=+3.137279305" lastFinishedPulling="2026-04-16 22:14:07.527401551 +0000 UTC m=+22.335654697" observedRunningTime="2026-04-16 22:14:07.938144848 +0000 UTC m=+22.746398001" watchObservedRunningTime="2026-04-16 22:14:10.31195336 +0000 UTC m=+25.120206512"
Apr 16 22:14:10.459831 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.459658 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4frh7"
Apr 16 22:14:10.460117 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.460093 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4frh7"
Apr 16 22:14:10.772721 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.772627 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:14:10.772896 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:10.772740 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802"
Apr 16 22:14:10.925862 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.925826 2560 generic.go:358] "Generic (PLEG): container finished" podID="e68012e7-cccf-4da0-86e1-1355b80e2784" containerID="aeb35e5d26f65065d278cb586dc76ae9f43646d1b06981e935db77d02578f3a0" exitCode=0
Apr 16 22:14:10.926033 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.925925 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" event={"ID":"e68012e7-cccf-4da0-86e1-1355b80e2784","Type":"ContainerDied","Data":"aeb35e5d26f65065d278cb586dc76ae9f43646d1b06981e935db77d02578f3a0"}
Apr 16 22:14:10.928805 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.928787 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log"
Apr 16 22:14:10.929175 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.929154 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" event={"ID":"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a","Type":"ContainerStarted","Data":"6dac41bafc69da1c4712314d29e76523c05895759df6d608a2eba1a188907979"}
Apr 16 22:14:10.929464 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.929448 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:14:10.929539 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.929474 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:14:10.929686 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.929606 2560 scope.go:117] "RemoveContainer" containerID="24fb41c82b7fa795142a440899bd9ac21b067b571058d46d04290d4d17a78dd9"
Apr 16 22:14:10.944307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:10.944281 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:14:11.773432 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:11.773401 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:11.773785 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:11.773445 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p"
Apr 16 22:14:11.773785 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:11.773529 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf"
Apr 16 22:14:11.773785 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:11.773642 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349"
Apr 16 22:14:11.933961 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:11.933940 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log"
Apr 16 22:14:11.934283 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:11.934257 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" event={"ID":"703a2cee-8a1d-4b57-b34b-e3d59b2bc18a","Type":"ContainerStarted","Data":"bd308e2bdd5315a2cce77ef892759368f993257dc95d98007d00d21465a23b1a"}
Apr 16 22:14:11.934685 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:11.934649 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:14:11.949028 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:11.949008 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:14:11.964701 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:11.964654 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql" podStartSLOduration=10.20116402 podStartE2EDuration="26.964640694s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:13:48.330732245 +0000 UTC m=+3.138985376" lastFinishedPulling="2026-04-16 22:14:05.09420892 +0000 UTC m=+19.902462050" observedRunningTime="2026-04-16 22:14:11.964554952 +0000 UTC m=+26.772808104" watchObservedRunningTime="2026-04-16 22:14:11.964640694 +0000 UTC m=+26.772893846"
Apr 16 22:14:12.127849 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:12.127821 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qfmn8"]
Apr 16 22:14:12.127989 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:12.127972 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:12.128097 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:12.128077 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf"
Apr 16 22:14:12.130904 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:12.130883 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cph62"]
Apr 16 22:14:12.131006 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:12.130994 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:14:12.131116 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:12.131096 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802"
Apr 16 22:14:12.131466 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:12.131431 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9lc9p"]
Apr 16 22:14:12.131548 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:12.131537 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p"
Apr 16 22:14:12.131735 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:12.131627 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349"
Apr 16 22:14:12.937960 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:12.937773 2560 generic.go:358] "Generic (PLEG): container finished" podID="e68012e7-cccf-4da0-86e1-1355b80e2784" containerID="72cfd52e06261a78b35870de4ed922fc39ce6182066f2c9e230d1dd440fa61c6" exitCode=0
Apr 16 22:14:12.938340 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:12.937850 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" event={"ID":"e68012e7-cccf-4da0-86e1-1355b80e2784","Type":"ContainerDied","Data":"72cfd52e06261a78b35870de4ed922fc39ce6182066f2c9e230d1dd440fa61c6"}
Apr 16 22:14:13.772632 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:13.772600 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p"
Apr 16 22:14:13.772632 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:13.772630 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:13.772819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:13.772642 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:14:13.772819 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:13.772740 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349"
Apr 16 22:14:13.772901 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:13.772855 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802"
Apr 16 22:14:13.772949 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:13.772933 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf"
Apr 16 22:14:14.944393 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:14.944360 2560 generic.go:358] "Generic (PLEG): container finished" podID="e68012e7-cccf-4da0-86e1-1355b80e2784" containerID="2b6ddda90a153e9288e7f62ae4e21e2885f6419adff7a2bda0905ffd56069416" exitCode=0
Apr 16 22:14:14.944838 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:14.944419 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" event={"ID":"e68012e7-cccf-4da0-86e1-1355b80e2784","Type":"ContainerDied","Data":"2b6ddda90a153e9288e7f62ae4e21e2885f6419adff7a2bda0905ffd56069416"}
Apr 16 22:14:15.773595 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:15.773567 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:14:15.773755 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:15.773646 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p"
Apr 16 22:14:15.773755 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:15.773683 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802"
Apr 16 22:14:15.773755 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:15.773709 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349"
Apr 16 22:14:15.773755 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:15.773741 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:15.773978 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:15.773789 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf"
Apr 16 22:14:17.773045 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:17.773015 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:14:17.773685 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:17.773015 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:17.773685 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:17.773155 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802"
Apr 16 22:14:17.773685 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:17.773015 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p"
Apr 16 22:14:17.773685 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:17.773227 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfmn8" podUID="fac69899-e7fe-4a77-b0b3-504c9f451bdf"
Apr 16 22:14:17.773685 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:17.773295 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9lc9p" podUID="816c1425-a29e-4e5e-b5ae-ad9b214b5349"
Apr 16 22:14:18.043753 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.043676 2560 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-35.ec2.internal" event="NodeReady"
Apr 16 22:14:18.043915 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.043804 2560 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 22:14:18.095495 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.095461 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rwmdq"]
Apr 16 22:14:18.117115 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.117074 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dnq97"]
Apr 16 22:14:18.117265 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.117184 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.120152 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.120117 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fk2j6\""
Apr 16 22:14:18.120274 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.120164 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 22:14:18.120331 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.120120 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 22:14:18.141442 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.141417 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dnq97"]
Apr 16 22:14:18.141574 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.141450 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rwmdq"]
Apr 16 22:14:18.141574 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.141560 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:14:18.143977 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.143956 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 22:14:18.144303 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.144287 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 22:14:18.144474 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.144454 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-86pmr\""
Apr 16 22:14:18.144658 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.144642 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 22:14:18.208699 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.208676 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8zjw\" (UniqueName: \"kubernetes.io/projected/d49b12b0-4e0c-4cb5-baba-dda53600ba56-kube-api-access-r8zjw\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.208819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.208724 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d49b12b0-4e0c-4cb5-baba-dda53600ba56-config-volume\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.208819 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.208751 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.208914 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.208824 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d49b12b0-4e0c-4cb5-baba-dda53600ba56-tmp-dir\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.310146 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.310071 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z7gp\" (UniqueName: \"kubernetes.io/projected/47261062-7dc9-439a-ab78-089432ccd885-kube-api-access-9z7gp\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:14:18.310288 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.310157 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d49b12b0-4e0c-4cb5-baba-dda53600ba56-config-volume\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.310288 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.310203 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.310288 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.310249 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:14:18.310288 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.310277 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d49b12b0-4e0c-4cb5-baba-dda53600ba56-tmp-dir\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.310485 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.310351 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8zjw\" (UniqueName: \"kubernetes.io/projected/d49b12b0-4e0c-4cb5-baba-dda53600ba56-kube-api-access-r8zjw\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.310485 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:18.310364 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:18.310575 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:18.310536 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls podName:d49b12b0-4e0c-4cb5-baba-dda53600ba56 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:18.81051462 +0000 UTC m=+33.618767762 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls") pod "dns-default-rwmdq" (UID: "d49b12b0-4e0c-4cb5-baba-dda53600ba56") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:18.310655 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.310623 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d49b12b0-4e0c-4cb5-baba-dda53600ba56-tmp-dir\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.310824 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.310804 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d49b12b0-4e0c-4cb5-baba-dda53600ba56-config-volume\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.323154 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.323129 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8zjw\" (UniqueName: \"kubernetes.io/projected/d49b12b0-4e0c-4cb5-baba-dda53600ba56-kube-api-access-r8zjw\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.410950 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.410897 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9z7gp\" (UniqueName: \"kubernetes.io/projected/47261062-7dc9-439a-ab78-089432ccd885-kube-api-access-9z7gp\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:14:18.411151 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.410988 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:14:18.411151 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:18.411103 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:18.411272 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:18.411168 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert podName:47261062-7dc9-439a-ab78-089432ccd885 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:18.9111488 +0000 UTC m=+33.719401929 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert") pod "ingress-canary-dnq97" (UID: "47261062-7dc9-439a-ab78-089432ccd885") : secret "canary-serving-cert" not found
Apr 16 22:14:18.421619 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.421587 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z7gp\" (UniqueName: \"kubernetes.io/projected/47261062-7dc9-439a-ab78-089432ccd885-kube-api-access-9z7gp\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:14:18.813787 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.813744 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:18.814316 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:18.813951 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:18.814316 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:18.814029 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls podName:d49b12b0-4e0c-4cb5-baba-dda53600ba56 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:19.814008132 +0000 UTC m=+34.622261276 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls") pod "dns-default-rwmdq" (UID: "d49b12b0-4e0c-4cb5-baba-dda53600ba56") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:18.915182 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:18.915147 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:14:18.915379 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:18.915315 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:18.915448 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:18.915416 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert podName:47261062-7dc9-439a-ab78-089432ccd885 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:19.915381453 +0000 UTC m=+34.723634585 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert") pod "ingress-canary-dnq97" (UID: "47261062-7dc9-439a-ab78-089432ccd885") : secret "canary-serving-cert" not found
Apr 16 22:14:19.419088 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.419042 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:14:19.419338 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:19.419217 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:19.419338 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:19.419310 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs podName:f45f69f8-87a8-49f9-bb8b-485368427802 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:51.419289718 +0000 UTC m=+66.227542869 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs") pod "network-metrics-daemon-cph62" (UID: "f45f69f8-87a8-49f9-bb8b-485368427802") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:19.520132 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.520091 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqsrd\" (UniqueName: \"kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd\") pod \"network-check-target-qfmn8\" (UID: \"fac69899-e7fe-4a77-b0b3-504c9f451bdf\") " pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:19.520308 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:19.520269 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:14:19.520308 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:19.520296 2560 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:14:19.520397 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:19.520311 2560 projected.go:194] Error preparing data for projected volume kube-api-access-gqsrd for pod openshift-network-diagnostics/network-check-target-qfmn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:19.520397 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:19.520367 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd podName:fac69899-e7fe-4a77-b0b3-504c9f451bdf nodeName:}" failed. No retries permitted until 2026-04-16 22:14:51.520352749 +0000 UTC m=+66.328605899 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqsrd" (UniqueName: "kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd") pod "network-check-target-qfmn8" (UID: "fac69899-e7fe-4a77-b0b3-504c9f451bdf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:19.773149 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.773070 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:19.773149 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.773097 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:14:19.773473 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.773341 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p"
Apr 16 22:14:19.776110 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.776090 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 22:14:19.777164 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.777123 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:14:19.777164 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.777147 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 22:14:19.777336 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.777168 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:14:19.777336 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.777203 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-2r56c\""
Apr 16 22:14:19.777336 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.777231 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dlkb\""
Apr 16 22:14:19.822059 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.822034 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:19.822370 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:19.822167 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:19.822370 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:19.822222 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls podName:d49b12b0-4e0c-4cb5-baba-dda53600ba56 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:21.822205261 +0000 UTC m=+36.630458401 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls") pod "dns-default-rwmdq" (UID: "d49b12b0-4e0c-4cb5-baba-dda53600ba56") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:19.922563 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:19.922527 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:14:19.922735 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:19.922682 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:19.922796 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:19.922748 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert podName:47261062-7dc9-439a-ab78-089432ccd885 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:21.922731352 +0000 UTC m=+36.730984482 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert") pod "ingress-canary-dnq97" (UID: "47261062-7dc9-439a-ab78-089432ccd885") : secret "canary-serving-cert" not found
Apr 16 22:14:21.837215 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:21.837031 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:21.837575 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:21.837189 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:21.837575 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:21.837325 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls podName:d49b12b0-4e0c-4cb5-baba-dda53600ba56 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:25.837310109 +0000 UTC m=+40.645563239 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls") pod "dns-default-rwmdq" (UID: "d49b12b0-4e0c-4cb5-baba-dda53600ba56") : secret "dns-default-metrics-tls" not found Apr 16 22:14:21.938149 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:21.938121 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97" Apr 16 22:14:21.938302 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:21.938219 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:21.938302 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:21.938268 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert podName:47261062-7dc9-439a-ab78-089432ccd885 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:25.938253399 +0000 UTC m=+40.746506529 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert") pod "ingress-canary-dnq97" (UID: "47261062-7dc9-439a-ab78-089432ccd885") : secret "canary-serving-cert" not found Apr 16 22:14:21.959836 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:21.959752 2560 generic.go:358] "Generic (PLEG): container finished" podID="e68012e7-cccf-4da0-86e1-1355b80e2784" containerID="8f1c0589f82f221f768a3b71169699fd92d5582a5e71dddb6dfa8c0cdb3f66a2" exitCode=0 Apr 16 22:14:21.959836 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:21.959796 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" event={"ID":"e68012e7-cccf-4da0-86e1-1355b80e2784","Type":"ContainerDied","Data":"8f1c0589f82f221f768a3b71169699fd92d5582a5e71dddb6dfa8c0cdb3f66a2"} Apr 16 22:14:22.039045 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:22.039010 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:14:22.049052 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:22.049023 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/816c1425-a29e-4e5e-b5ae-ad9b214b5349-original-pull-secret\") pod \"global-pull-secret-syncer-9lc9p\" (UID: \"816c1425-a29e-4e5e-b5ae-ad9b214b5349\") " pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:14:22.197610 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:22.197575 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9lc9p" Apr 16 22:14:22.390326 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:22.390294 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9lc9p"] Apr 16 22:14:22.393856 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:14:22.393831 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod816c1425_a29e_4e5e_b5ae_ad9b214b5349.slice/crio-094e21d1a87308d6114972ffc5c0cbe2029025fd77c5b9436fe8039e9497867c WatchSource:0}: Error finding container 094e21d1a87308d6114972ffc5c0cbe2029025fd77c5b9436fe8039e9497867c: Status 404 returned error can't find the container with id 094e21d1a87308d6114972ffc5c0cbe2029025fd77c5b9436fe8039e9497867c Apr 16 22:14:22.963265 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:22.963226 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9lc9p" event={"ID":"816c1425-a29e-4e5e-b5ae-ad9b214b5349","Type":"ContainerStarted","Data":"094e21d1a87308d6114972ffc5c0cbe2029025fd77c5b9436fe8039e9497867c"} Apr 16 22:14:22.966259 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:22.966229 2560 generic.go:358] "Generic (PLEG): container finished" podID="e68012e7-cccf-4da0-86e1-1355b80e2784" containerID="68de6986055ed12a48dc3b4a680e7d88daa478b39b4b17d41024f9d32000a273" exitCode=0 Apr 16 22:14:22.966359 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:22.966274 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" event={"ID":"e68012e7-cccf-4da0-86e1-1355b80e2784","Type":"ContainerDied","Data":"68de6986055ed12a48dc3b4a680e7d88daa478b39b4b17d41024f9d32000a273"} Apr 16 22:14:23.974694 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:23.974505 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" 
event={"ID":"e68012e7-cccf-4da0-86e1-1355b80e2784","Type":"ContainerStarted","Data":"62091c9538a0d7266746a36bbf71daf55605fb1efa6396d183f1c6a3923a92bb"} Apr 16 22:14:23.999842 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:23.999797 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9sv6h" podStartSLOduration=6.535159812 podStartE2EDuration="38.999784059s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:13:48.3262312 +0000 UTC m=+3.134484340" lastFinishedPulling="2026-04-16 22:14:20.790855457 +0000 UTC m=+35.599108587" observedRunningTime="2026-04-16 22:14:23.998178251 +0000 UTC m=+38.806431403" watchObservedRunningTime="2026-04-16 22:14:23.999784059 +0000 UTC m=+38.808037232" Apr 16 22:14:25.867328 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:25.867284 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq" Apr 16 22:14:25.867791 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:25.867419 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:25.867791 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:25.867509 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls podName:d49b12b0-4e0c-4cb5-baba-dda53600ba56 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:33.867484414 +0000 UTC m=+48.675737564 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls") pod "dns-default-rwmdq" (UID: "d49b12b0-4e0c-4cb5-baba-dda53600ba56") : secret "dns-default-metrics-tls" not found Apr 16 22:14:25.967898 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:25.967846 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97" Apr 16 22:14:25.968084 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:25.968008 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:25.968084 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:25.968084 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert podName:47261062-7dc9-439a-ab78-089432ccd885 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:33.968064053 +0000 UTC m=+48.776317197 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert") pod "ingress-canary-dnq97" (UID: "47261062-7dc9-439a-ab78-089432ccd885") : secret "canary-serving-cert" not found Apr 16 22:14:26.982130 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:26.982098 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9lc9p" event={"ID":"816c1425-a29e-4e5e-b5ae-ad9b214b5349","Type":"ContainerStarted","Data":"24aefee94388d4cd52623a6cc02ff90914f1d6ee912cf3f0e04c66d5b78f724d"} Apr 16 22:14:27.982489 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:27.982437 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9lc9p" podStartSLOduration=34.332643124 podStartE2EDuration="37.982422074s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:14:22.395573955 +0000 UTC m=+37.203827089" lastFinishedPulling="2026-04-16 22:14:26.04535291 +0000 UTC m=+40.853606039" observedRunningTime="2026-04-16 22:14:26.995621438 +0000 UTC m=+41.803874590" watchObservedRunningTime="2026-04-16 22:14:27.982422074 +0000 UTC m=+42.790675237" Apr 16 22:14:27.982845 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:27.982709 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll"] Apr 16 22:14:27.985549 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:27.985528 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" Apr 16 22:14:27.988626 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:27.988595 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 22:14:27.989119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:27.989103 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 22:14:27.989847 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:27.989830 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 22:14:27.989954 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:27.989831 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 22:14:27.989954 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:27.989890 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-55lsm\"" Apr 16 22:14:27.996355 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:27.996333 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll"] Apr 16 22:14:28.015571 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.015541 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc"] Apr 16 22:14:28.018281 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.018265 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.021383 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.021360 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 22:14:28.021503 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.021397 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 22:14:28.021503 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.021360 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 22:14:28.021503 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.021484 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 22:14:28.027272 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.027253 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc"] Apr 16 22:14:28.083114 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.083080 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/35f19367-f071-4514-8913-616a5b523ec9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84df6d66f9-rpxll\" (UID: \"35f19367-f071-4514-8913-616a5b523ec9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" Apr 16 22:14:28.083284 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.083121 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vm5zd\" (UniqueName: \"kubernetes.io/projected/35f19367-f071-4514-8913-616a5b523ec9-kube-api-access-vm5zd\") pod \"managed-serviceaccount-addon-agent-84df6d66f9-rpxll\" (UID: \"35f19367-f071-4514-8913-616a5b523ec9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" Apr 16 22:14:28.183657 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.183608 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.183839 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.183691 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.183839 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.183719 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/35f19367-f071-4514-8913-616a5b523ec9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84df6d66f9-rpxll\" (UID: \"35f19367-f071-4514-8913-616a5b523ec9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" Apr 16 22:14:28.183839 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.183739 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-hub\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.183839 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.183764 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fqh9\" (UniqueName: \"kubernetes.io/projected/8098b888-0ffc-4df4-85d4-8ede78080c9a-kube-api-access-6fqh9\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.183839 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.183817 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vm5zd\" (UniqueName: \"kubernetes.io/projected/35f19367-f071-4514-8913-616a5b523ec9-kube-api-access-vm5zd\") pod \"managed-serviceaccount-addon-agent-84df6d66f9-rpxll\" (UID: \"35f19367-f071-4514-8913-616a5b523ec9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" Apr 16 22:14:28.184056 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.183851 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8098b888-0ffc-4df4-85d4-8ede78080c9a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.184056 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.183931 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-ca\") pod 
\"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.187306 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.187287 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/35f19367-f071-4514-8913-616a5b523ec9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84df6d66f9-rpxll\" (UID: \"35f19367-f071-4514-8913-616a5b523ec9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" Apr 16 22:14:28.192234 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.192211 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm5zd\" (UniqueName: \"kubernetes.io/projected/35f19367-f071-4514-8913-616a5b523ec9-kube-api-access-vm5zd\") pod \"managed-serviceaccount-addon-agent-84df6d66f9-rpxll\" (UID: \"35f19367-f071-4514-8913-616a5b523ec9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" Apr 16 22:14:28.285217 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.285111 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-ca\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.285217 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.285194 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.285413 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.285256 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.285413 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.285284 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-hub\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.285413 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.285308 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fqh9\" (UniqueName: \"kubernetes.io/projected/8098b888-0ffc-4df4-85d4-8ede78080c9a-kube-api-access-6fqh9\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.285413 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.285334 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8098b888-0ffc-4df4-85d4-8ede78080c9a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.286218 ip-10-0-142-35 kubenswrapper[2560]: I0416 
22:14:28.286188 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8098b888-0ffc-4df4-85d4-8ede78080c9a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.287675 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.287644 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-ca\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.287789 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.287673 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.287859 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.287844 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.288033 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.288017 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8098b888-0ffc-4df4-85d4-8ede78080c9a-hub\") pod 
\"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.293708 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.293682 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fqh9\" (UniqueName: \"kubernetes.io/projected/8098b888-0ffc-4df4-85d4-8ede78080c9a-kube-api-access-6fqh9\") pod \"cluster-proxy-proxy-agent-684689bfb9-lp2jc\" (UID: \"8098b888-0ffc-4df4-85d4-8ede78080c9a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.306510 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.306487 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" Apr 16 22:14:28.327290 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.327256 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" Apr 16 22:14:28.450524 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.450496 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll"] Apr 16 22:14:28.453533 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:14:28.453504 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35f19367_f071_4514_8913_616a5b523ec9.slice/crio-bbdff611831c8d62725b899339d2fca4516569e8a0612a55febc65c876849b02 WatchSource:0}: Error finding container bbdff611831c8d62725b899339d2fca4516569e8a0612a55febc65c876849b02: Status 404 returned error can't find the container with id bbdff611831c8d62725b899339d2fca4516569e8a0612a55febc65c876849b02 Apr 16 22:14:28.469166 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.469136 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc"] Apr 16 22:14:28.472450 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:14:28.472425 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8098b888_0ffc_4df4_85d4_8ede78080c9a.slice/crio-8a84d51013392e898a10ae07b5ce98bbc15cf5f075b6dc100028b02682ceb58c WatchSource:0}: Error finding container 8a84d51013392e898a10ae07b5ce98bbc15cf5f075b6dc100028b02682ceb58c: Status 404 returned error can't find the container with id 8a84d51013392e898a10ae07b5ce98bbc15cf5f075b6dc100028b02682ceb58c Apr 16 22:14:28.987556 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.987516 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" 
event={"ID":"8098b888-0ffc-4df4-85d4-8ede78080c9a","Type":"ContainerStarted","Data":"8a84d51013392e898a10ae07b5ce98bbc15cf5f075b6dc100028b02682ceb58c"}
Apr 16 22:14:28.988816 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:28.988776 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" event={"ID":"35f19367-f071-4514-8913-616a5b523ec9","Type":"ContainerStarted","Data":"bbdff611831c8d62725b899339d2fca4516569e8a0612a55febc65c876849b02"}
Apr 16 22:14:31.996601 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:31.996508 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" event={"ID":"8098b888-0ffc-4df4-85d4-8ede78080c9a","Type":"ContainerStarted","Data":"bcb1b5d2dce8ff9fd41823f75a21fd53ecc7667671ca6d185f2ce6760b3bd5fc"}
Apr 16 22:14:31.998089 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:31.998061 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" event={"ID":"35f19367-f071-4514-8913-616a5b523ec9","Type":"ContainerStarted","Data":"fe970223d19698ba17bab29f91e77648dff7774f3bc5805b30799dd73880e6e1"}
Apr 16 22:14:32.018901 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:32.018834 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" podStartSLOduration=2.034549358 podStartE2EDuration="5.018819715s" podCreationTimestamp="2026-04-16 22:14:27 +0000 UTC" firstStartedPulling="2026-04-16 22:14:28.455367459 +0000 UTC m=+43.263620589" lastFinishedPulling="2026-04-16 22:14:31.439637816 +0000 UTC m=+46.247890946" observedRunningTime="2026-04-16 22:14:32.01807647 +0000 UTC m=+46.826329624" watchObservedRunningTime="2026-04-16 22:14:32.018819715 +0000 UTC m=+46.827072867"
Apr 16 22:14:33.929715 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:33.929657 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:33.930106 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:33.929789 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:33.930106 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:33.929844 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls podName:d49b12b0-4e0c-4cb5-baba-dda53600ba56 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:49.929830391 +0000 UTC m=+64.738083522 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls") pod "dns-default-rwmdq" (UID: "d49b12b0-4e0c-4cb5-baba-dda53600ba56") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:34.003943 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:34.003909 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" event={"ID":"8098b888-0ffc-4df4-85d4-8ede78080c9a","Type":"ContainerStarted","Data":"2c26cf86cf91407256880227404e02a88df62ffd7c148d57955569b9109ed929"}
Apr 16 22:14:34.003943 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:34.003943 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" event={"ID":"8098b888-0ffc-4df4-85d4-8ede78080c9a","Type":"ContainerStarted","Data":"1771b8b9c353ed465d37fc8a891e3356d5a0350a18238f18f90a51a1d689e234"}
Apr 16 22:14:34.022316 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:34.022263 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" podStartSLOduration=2.300645221 podStartE2EDuration="7.022248921s" podCreationTimestamp="2026-04-16 22:14:27 +0000 UTC" firstStartedPulling="2026-04-16 22:14:28.474204215 +0000 UTC m=+43.282457345" lastFinishedPulling="2026-04-16 22:14:33.195807902 +0000 UTC m=+48.004061045" observedRunningTime="2026-04-16 22:14:34.021523337 +0000 UTC m=+48.829776490" watchObservedRunningTime="2026-04-16 22:14:34.022248921 +0000 UTC m=+48.830502074"
Apr 16 22:14:34.030865 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:34.030844 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:14:34.031056 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:34.031035 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:34.031110 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:34.031100 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert podName:47261062-7dc9-439a-ab78-089432ccd885 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:50.031082905 +0000 UTC m=+64.839336048 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert") pod "ingress-canary-dnq97" (UID: "47261062-7dc9-439a-ab78-089432ccd885") : secret "canary-serving-cert" not found
Apr 16 22:14:43.955271 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:43.955242 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5ggql"
Apr 16 22:14:49.941031 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:49.940976 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:14:49.941406 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:49.941134 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:49.941406 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:49.941200 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls podName:d49b12b0-4e0c-4cb5-baba-dda53600ba56 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:21.941182672 +0000 UTC m=+96.749435802 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls") pod "dns-default-rwmdq" (UID: "d49b12b0-4e0c-4cb5-baba-dda53600ba56") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:50.041288 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:50.041255 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:14:50.041463 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:50.041409 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:50.041527 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:50.041475 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert podName:47261062-7dc9-439a-ab78-089432ccd885 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:22.04145595 +0000 UTC m=+96.849709094 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert") pod "ingress-canary-dnq97" (UID: "47261062-7dc9-439a-ab78-089432ccd885") : secret "canary-serving-cert" not found
Apr 16 22:14:51.451067 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:51.451016 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:14:51.453991 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:51.453970 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 22:14:51.462092 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:51.462069 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:14:51.462195 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:14:51.462145 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs podName:f45f69f8-87a8-49f9-bb8b-485368427802 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:55.462123025 +0000 UTC m=+130.270376161 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs") pod "network-metrics-daemon-cph62" (UID: "f45f69f8-87a8-49f9-bb8b-485368427802") : secret "metrics-daemon-secret" not found
Apr 16 22:14:51.552232 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:51.552199 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqsrd\" (UniqueName: \"kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd\") pod \"network-check-target-qfmn8\" (UID: \"fac69899-e7fe-4a77-b0b3-504c9f451bdf\") " pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:51.555168 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:51.555150 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:14:51.565232 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:51.565217 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:14:51.575395 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:51.575376 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqsrd\" (UniqueName: \"kubernetes.io/projected/fac69899-e7fe-4a77-b0b3-504c9f451bdf-kube-api-access-gqsrd\") pod \"network-check-target-qfmn8\" (UID: \"fac69899-e7fe-4a77-b0b3-504c9f451bdf\") " pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:51.587857 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:51.587831 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-2r56c\""
Apr 16 22:14:51.596237 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:51.596218 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:51.703435 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:51.703367 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qfmn8"]
Apr 16 22:14:51.706325 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:14:51.706294 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac69899_e7fe_4a77_b0b3_504c9f451bdf.slice/crio-3a24065f6e755b7cfd6ab1ba0faf4b1dcca4a2ec5e99f6790d77c8c0897a0d29 WatchSource:0}: Error finding container 3a24065f6e755b7cfd6ab1ba0faf4b1dcca4a2ec5e99f6790d77c8c0897a0d29: Status 404 returned error can't find the container with id 3a24065f6e755b7cfd6ab1ba0faf4b1dcca4a2ec5e99f6790d77c8c0897a0d29
Apr 16 22:14:52.038007 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:52.037924 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qfmn8" event={"ID":"fac69899-e7fe-4a77-b0b3-504c9f451bdf","Type":"ContainerStarted","Data":"3a24065f6e755b7cfd6ab1ba0faf4b1dcca4a2ec5e99f6790d77c8c0897a0d29"}
Apr 16 22:14:55.047107 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:55.047068 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qfmn8" event={"ID":"fac69899-e7fe-4a77-b0b3-504c9f451bdf","Type":"ContainerStarted","Data":"cbe89ff98b91632d47b3f454159c9554d3258c50f0a89711b368152e857af89c"}
Apr 16 22:14:55.047513 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:55.047233 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:14:55.063604 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:14:55.063560 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qfmn8" podStartSLOduration=67.422237582 podStartE2EDuration="1m10.063547949s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:14:51.708149604 +0000 UTC m=+66.516402734" lastFinishedPulling="2026-04-16 22:14:54.349459971 +0000 UTC m=+69.157713101" observedRunningTime="2026-04-16 22:14:55.062129513 +0000 UTC m=+69.870382665" watchObservedRunningTime="2026-04-16 22:14:55.063547949 +0000 UTC m=+69.871801101"
Apr 16 22:15:21.975943 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:15:21.975860 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:15:21.976301 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:15:21.976008 2560 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:15:21.976301 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:15:21.976085 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls podName:d49b12b0-4e0c-4cb5-baba-dda53600ba56 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:25.97606379 +0000 UTC m=+160.784316920 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls") pod "dns-default-rwmdq" (UID: "d49b12b0-4e0c-4cb5-baba-dda53600ba56") : secret "dns-default-metrics-tls" not found
Apr 16 22:15:22.076772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:15:22.076737 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97"
Apr 16 22:15:22.076938 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:15:22.076847 2560 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:15:22.076938 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:15:22.076916 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert podName:47261062-7dc9-439a-ab78-089432ccd885 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:26.076901828 +0000 UTC m=+160.885154957 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert") pod "ingress-canary-dnq97" (UID: "47261062-7dc9-439a-ab78-089432ccd885") : secret "canary-serving-cert" not found
Apr 16 22:15:26.051268 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:15:26.051235 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qfmn8"
Apr 16 22:15:55.506561 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:15:55.506518 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62"
Apr 16 22:15:55.507092 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:15:55.506644 2560 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:15:55.507092 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:15:55.506701 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs podName:f45f69f8-87a8-49f9-bb8b-485368427802 nodeName:}" failed. No retries permitted until 2026-04-16 22:17:57.506686519 +0000 UTC m=+252.314939649 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs") pod "network-metrics-daemon-cph62" (UID: "f45f69f8-87a8-49f9-bb8b-485368427802") : secret "metrics-daemon-secret" not found
Apr 16 22:16:00.547030 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:00.547002 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p58zd_81cb0b16-9454-40e0-907f-db4de6741a0c/dns-node-resolver/0.log"
Apr 16 22:16:01.345397 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:01.345366 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bn8st_11442be0-1bee-4a8d-8374-44ea164b6268/node-ca/0.log"
Apr 16 22:16:21.128885 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:16:21.128825 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rwmdq" podUID="d49b12b0-4e0c-4cb5-baba-dda53600ba56"
Apr 16 22:16:21.150995 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:16:21.150969 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dnq97" podUID="47261062-7dc9-439a-ab78-089432ccd885"
Apr 16 22:16:21.244575 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:21.244545 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:16:22.091455 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.091427 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rspvv"]
Apr 16 22:16:22.096301 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.096274 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.100228 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.100207 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 22:16:22.100350 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.100265 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-k7nbm\""
Apr 16 22:16:22.100350 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.100288 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 22:16:22.100350 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.100299 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 22:16:22.100350 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.100265 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 22:16:22.104801 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.104779 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rspvv"]
Apr 16 22:16:22.188072 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.188039 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-data-volume\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.188422 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.188078 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.188422 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.188100 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-crio-socket\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.188422 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.188119 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.188422 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.188143 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jfnn\" (UniqueName: \"kubernetes.io/projected/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-kube-api-access-2jfnn\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.213762 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.213729 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d6966496b-phxbk"]
Apr 16 22:16:22.216535 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.216519 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.219257 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.219233 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 22:16:22.219378 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.219299 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-48hdh\""
Apr 16 22:16:22.219553 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.219537 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 22:16:22.219600 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.219562 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 22:16:22.225344 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.225320 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 22:16:22.232624 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.232603 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d6966496b-phxbk"]
Apr 16 22:16:22.288941 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.288915 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-data-volume\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.289068 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.288952 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.289068 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.288981 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-image-registry-private-configuration\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.289068 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289002 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-crio-socket\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.289068 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289039 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-ca-trust-extracted\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.289068 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289059 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-installation-pull-secrets\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.289307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289075 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-registry-tls\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.289307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289090 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48j8q\" (UniqueName: \"kubernetes.io/projected/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-kube-api-access-48j8q\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.289307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289090 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-crio-socket\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.289307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289177 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jfnn\" (UniqueName: \"kubernetes.io/projected/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-kube-api-access-2jfnn\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.289307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289253 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-bound-sa-token\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.289307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289260 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-data-volume\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.289307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289299 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-registry-certificates\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.289529 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289321 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.289529 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289342 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-trusted-ca\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.289529 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.289510 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.291431 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.291414 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.300445 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.300424 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jfnn\" (UniqueName: \"kubernetes.io/projected/c2c07ee2-f936-48dd-bba9-993d5f5a4d00-kube-api-access-2jfnn\") pod \"insights-runtime-extractor-rspvv\" (UID: \"c2c07ee2-f936-48dd-bba9-993d5f5a4d00\") " pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.389696 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.389614 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-registry-certificates\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.389696 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.389654 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-trusted-ca\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.389696 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.389682 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-image-registry-private-configuration\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.389913 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.389713 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-ca-trust-extracted\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.389913 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.389860 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-installation-pull-secrets\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.389989 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.389923 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-registry-tls\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.389989 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.389948 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48j8q\" (UniqueName: \"kubernetes.io/projected/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-kube-api-access-48j8q\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.390424 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.390154 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-ca-trust-extracted\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.390424 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.390162 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-bound-sa-token\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.390548 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.390530 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-registry-certificates\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.391050 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.391027 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-trusted-ca\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.392334 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.392307 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-installation-pull-secrets\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.392434 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.392360 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-image-registry-private-configuration\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.392493 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.392482 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-registry-tls\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.402357 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.402329 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-bound-sa-token\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.403567 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.403540 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48j8q\" (UniqueName: \"kubernetes.io/projected/c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d-kube-api-access-48j8q\") pod \"image-registry-5d6966496b-phxbk\" (UID: \"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d\") " pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.405180 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.405165 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rspvv"
Apr 16 22:16:22.526904 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.526850 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:22.530555 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.530529 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rspvv"]
Apr 16 22:16:22.534501 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:16:22.534468 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2c07ee2_f936_48dd_bba9_993d5f5a4d00.slice/crio-91d5e655d92139e5d20c379e795a8d2ae7723a42315289955cd6d05fd7e11b69 WatchSource:0}: Error finding container 91d5e655d92139e5d20c379e795a8d2ae7723a42315289955cd6d05fd7e11b69: Status 404 returned error can't find the container with id 91d5e655d92139e5d20c379e795a8d2ae7723a42315289955cd6d05fd7e11b69
Apr 16 22:16:22.652353 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:22.652278 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d6966496b-phxbk"]
Apr 16 22:16:22.654754 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:16:22.654720 2560 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4a5d274_9cc4_4ba2_b606_f554ecfd3f5d.slice/crio-948c41759659b100ce9858887e969236fef2f95c9cab4995b79db6fcb27beecf WatchSource:0}: Error finding container 948c41759659b100ce9858887e969236fef2f95c9cab4995b79db6fcb27beecf: Status 404 returned error can't find the container with id 948c41759659b100ce9858887e969236fef2f95c9cab4995b79db6fcb27beecf Apr 16 22:16:22.791664 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:16:22.791613 2560 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-cph62" podUID="f45f69f8-87a8-49f9-bb8b-485368427802" Apr 16 22:16:23.250712 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:23.250677 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d6966496b-phxbk" event={"ID":"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d","Type":"ContainerStarted","Data":"b1807a46ad123419df24510ecf6bbadf0fbcaab46ae55d34df6f8da5f0d368dc"} Apr 16 22:16:23.251089 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:23.250712 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d6966496b-phxbk" event={"ID":"c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d","Type":"ContainerStarted","Data":"948c41759659b100ce9858887e969236fef2f95c9cab4995b79db6fcb27beecf"} Apr 16 22:16:23.251089 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:23.250812 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d6966496b-phxbk" Apr 16 22:16:23.252101 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:23.252081 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rspvv" 
event={"ID":"c2c07ee2-f936-48dd-bba9-993d5f5a4d00","Type":"ContainerStarted","Data":"dc9e2042b4222881e205cfdaf54820205921eb755e651948e888c98e663f7faa"} Apr 16 22:16:23.252190 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:23.252105 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rspvv" event={"ID":"c2c07ee2-f936-48dd-bba9-993d5f5a4d00","Type":"ContainerStarted","Data":"91d5e655d92139e5d20c379e795a8d2ae7723a42315289955cd6d05fd7e11b69"} Apr 16 22:16:23.271584 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:23.271547 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5d6966496b-phxbk" podStartSLOduration=1.271533083 podStartE2EDuration="1.271533083s" podCreationTimestamp="2026-04-16 22:16:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:23.270728597 +0000 UTC m=+158.078981751" watchObservedRunningTime="2026-04-16 22:16:23.271533083 +0000 UTC m=+158.079786266" Apr 16 22:16:24.259746 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:24.259709 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rspvv" event={"ID":"c2c07ee2-f936-48dd-bba9-993d5f5a4d00","Type":"ContainerStarted","Data":"f5532c362f666730517d92311731bd67c427a680275c752c4c3ed4c122444e2c"} Apr 16 22:16:25.264232 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:25.264193 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rspvv" event={"ID":"c2c07ee2-f936-48dd-bba9-993d5f5a4d00","Type":"ContainerStarted","Data":"399bee64f2304688208e91e40c048ce69763215f8858470b2793e34e9474cfb6"} Apr 16 22:16:25.285976 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:25.285927 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-rspvv" podStartSLOduration=1.100830179 podStartE2EDuration="3.285914324s" podCreationTimestamp="2026-04-16 22:16:22 +0000 UTC" firstStartedPulling="2026-04-16 22:16:22.60505724 +0000 UTC m=+157.413310373" lastFinishedPulling="2026-04-16 22:16:24.790141386 +0000 UTC m=+159.598394518" observedRunningTime="2026-04-16 22:16:25.284319551 +0000 UTC m=+160.092572702" watchObservedRunningTime="2026-04-16 22:16:25.285914324 +0000 UTC m=+160.094167468" Apr 16 22:16:26.015616 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:26.015568 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq" Apr 16 22:16:26.017831 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:26.017804 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d49b12b0-4e0c-4cb5-baba-dda53600ba56-metrics-tls\") pod \"dns-default-rwmdq\" (UID: \"d49b12b0-4e0c-4cb5-baba-dda53600ba56\") " pod="openshift-dns/dns-default-rwmdq" Apr 16 22:16:26.048675 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:26.048646 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fk2j6\"" Apr 16 22:16:26.055480 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:26.055461 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rwmdq" Apr 16 22:16:26.116022 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:26.115987 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97" Apr 16 22:16:26.119339 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:26.119313 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47261062-7dc9-439a-ab78-089432ccd885-cert\") pod \"ingress-canary-dnq97\" (UID: \"47261062-7dc9-439a-ab78-089432ccd885\") " pod="openshift-ingress-canary/ingress-canary-dnq97" Apr 16 22:16:26.172609 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:26.172564 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rwmdq"] Apr 16 22:16:26.176222 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:16:26.176194 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd49b12b0_4e0c_4cb5_baba_dda53600ba56.slice/crio-964f602aa8b4ed7f3ba43a75abb8d539d45489c9c8729b0bf0d32eef3f66fafe WatchSource:0}: Error finding container 964f602aa8b4ed7f3ba43a75abb8d539d45489c9c8729b0bf0d32eef3f66fafe: Status 404 returned error can't find the container with id 964f602aa8b4ed7f3ba43a75abb8d539d45489c9c8729b0bf0d32eef3f66fafe Apr 16 22:16:26.267935 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:26.267840 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rwmdq" event={"ID":"d49b12b0-4e0c-4cb5-baba-dda53600ba56","Type":"ContainerStarted","Data":"964f602aa8b4ed7f3ba43a75abb8d539d45489c9c8729b0bf0d32eef3f66fafe"} Apr 16 22:16:28.275971 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:28.275891 2560 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-dns/dns-default-rwmdq" event={"ID":"d49b12b0-4e0c-4cb5-baba-dda53600ba56","Type":"ContainerStarted","Data":"7090abffb937ffe1eb0db760ccae7486752702c9da9b86b0a805fa9e1bbf8a43"} Apr 16 22:16:28.275971 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:28.275927 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rwmdq" event={"ID":"d49b12b0-4e0c-4cb5-baba-dda53600ba56","Type":"ContainerStarted","Data":"b8e41792bf9d7ed54dd5855b2795dbed7f8f013c1f5dbc8ff0d1644821bf36af"} Apr 16 22:16:28.276376 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:28.276032 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rwmdq" Apr 16 22:16:28.296916 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:28.295624 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rwmdq" podStartSLOduration=128.914324226 podStartE2EDuration="2m10.295606401s" podCreationTimestamp="2026-04-16 22:14:18 +0000 UTC" firstStartedPulling="2026-04-16 22:16:26.178091158 +0000 UTC m=+160.986344291" lastFinishedPulling="2026-04-16 22:16:27.559373332 +0000 UTC m=+162.367626466" observedRunningTime="2026-04-16 22:16:28.292615743 +0000 UTC m=+163.100868894" watchObservedRunningTime="2026-04-16 22:16:28.295606401 +0000 UTC m=+163.103859555" Apr 16 22:16:31.772711 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:31.772674 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dnq97" Apr 16 22:16:31.775622 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:31.775599 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-86pmr\"" Apr 16 22:16:31.783675 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:31.783661 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dnq97" Apr 16 22:16:31.896192 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:31.896159 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dnq97"] Apr 16 22:16:31.900158 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:16:31.900129 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47261062_7dc9_439a_ab78_089432ccd885.slice/crio-ce3c27fd8f10cd5f50630ccdacc6075e2302046ed8c991232d8d2891b665d897 WatchSource:0}: Error finding container ce3c27fd8f10cd5f50630ccdacc6075e2302046ed8c991232d8d2891b665d897: Status 404 returned error can't find the container with id ce3c27fd8f10cd5f50630ccdacc6075e2302046ed8c991232d8d2891b665d897 Apr 16 22:16:32.286284 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:32.286193 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dnq97" event={"ID":"47261062-7dc9-439a-ab78-089432ccd885","Type":"ContainerStarted","Data":"ce3c27fd8f10cd5f50630ccdacc6075e2302046ed8c991232d8d2891b665d897"} Apr 16 22:16:32.287390 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:32.287365 2560 generic.go:358] "Generic (PLEG): container finished" podID="35f19367-f071-4514-8913-616a5b523ec9" containerID="fe970223d19698ba17bab29f91e77648dff7774f3bc5805b30799dd73880e6e1" exitCode=255 Apr 16 22:16:32.287513 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:32.287428 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" event={"ID":"35f19367-f071-4514-8913-616a5b523ec9","Type":"ContainerDied","Data":"fe970223d19698ba17bab29f91e77648dff7774f3bc5805b30799dd73880e6e1"} Apr 16 22:16:32.287765 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:32.287749 2560 scope.go:117] "RemoveContainer" 
containerID="fe970223d19698ba17bab29f91e77648dff7774f3bc5805b30799dd73880e6e1" Apr 16 22:16:33.291715 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:33.291672 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84df6d66f9-rpxll" event={"ID":"35f19367-f071-4514-8913-616a5b523ec9","Type":"ContainerStarted","Data":"8f7b50283826d96f9aa5b3cdea67e6e54dbded23f3650879dbb1b7ee102e52db"} Apr 16 22:16:33.773008 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:33.772975 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:16:34.295607 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:34.295568 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dnq97" event={"ID":"47261062-7dc9-439a-ab78-089432ccd885","Type":"ContainerStarted","Data":"a558bd906aa36a91252ea73ca66fada5400b0e3bb67283dd2d71934e7f8292a3"} Apr 16 22:16:34.314054 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:34.314007 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dnq97" podStartSLOduration=134.787114491 podStartE2EDuration="2m16.313993133s" podCreationTimestamp="2026-04-16 22:14:18 +0000 UTC" firstStartedPulling="2026-04-16 22:16:31.901962247 +0000 UTC m=+166.710215377" lastFinishedPulling="2026-04-16 22:16:33.428840886 +0000 UTC m=+168.237094019" observedRunningTime="2026-04-16 22:16:34.313456524 +0000 UTC m=+169.121709668" watchObservedRunningTime="2026-04-16 22:16:34.313993133 +0000 UTC m=+169.122246355" Apr 16 22:16:37.733979 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.733944 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-fxpxv"] Apr 16 22:16:37.737016 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.736997 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.739531 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.739510 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:16:37.739652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.739562 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:16:37.739652 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.739579 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:16:37.739811 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.739798 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:16:37.739933 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.739915 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:16:37.740690 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.740678 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:16:37.740751 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.740731 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pvvh7\"" Apr 16 22:16:37.799150 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.799127 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21a3f8ac-083c-407f-b480-6676a3c5c069-root\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 
16 22:16:37.799281 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.799156 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21a3f8ac-083c-407f-b480-6676a3c5c069-sys\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.799281 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.799173 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-textfile\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.799281 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.799188 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-tls\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.799281 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.799240 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-accelerators-collector-config\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.799281 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.799276 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxvkb\" (UniqueName: 
\"kubernetes.io/projected/21a3f8ac-083c-407f-b480-6676a3c5c069-kube-api-access-vxvkb\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.799468 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.799308 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-wtmp\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.799468 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.799338 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21a3f8ac-083c-407f-b480-6676a3c5c069-metrics-client-ca\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.799468 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.799355 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.899626 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899591 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxvkb\" (UniqueName: \"kubernetes.io/projected/21a3f8ac-083c-407f-b480-6676a3c5c069-kube-api-access-vxvkb\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.899807 ip-10-0-142-35 kubenswrapper[2560]: 
I0416 22:16:37.899642 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-wtmp\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.899807 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899682 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21a3f8ac-083c-407f-b480-6676a3c5c069-metrics-client-ca\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.899807 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899706 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.899807 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899789 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21a3f8ac-083c-407f-b480-6676a3c5c069-root\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.900051 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899821 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-wtmp\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.900051 
ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899831 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21a3f8ac-083c-407f-b480-6676a3c5c069-sys\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.900051 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899852 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-textfile\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.900051 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899864 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21a3f8ac-083c-407f-b480-6676a3c5c069-root\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.900051 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899885 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-tls\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.900051 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899914 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21a3f8ac-083c-407f-b480-6676a3c5c069-sys\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.900051 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.899921 
2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-accelerators-collector-config\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.900051 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:16:37.899999 2560 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 22:16:37.900346 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:16:37.900082 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-tls podName:21a3f8ac-083c-407f-b480-6676a3c5c069 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:38.400061147 +0000 UTC m=+173.208314278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-tls") pod "node-exporter-fxpxv" (UID: "21a3f8ac-083c-407f-b480-6676a3c5c069") : secret "node-exporter-tls" not found Apr 16 22:16:37.900346 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.900275 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-textfile\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv" Apr 16 22:16:37.900484 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.900467 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv"
Apr 16 22:16:37.900606 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.900588 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21a3f8ac-083c-407f-b480-6676a3c5c069-metrics-client-ca\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv"
Apr 16 22:16:37.902119 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.902100 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv"
Apr 16 22:16:37.910288 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:37.910260 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxvkb\" (UniqueName: \"kubernetes.io/projected/21a3f8ac-083c-407f-b480-6676a3c5c069-kube-api-access-vxvkb\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv"
Apr 16 22:16:38.280592 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:38.280562 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rwmdq"
Apr 16 22:16:38.404678 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:38.404646 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-tls\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv"
Apr 16 22:16:38.404881 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:16:38.404797 2560 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 22:16:38.404948 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:16:38.404881 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-tls podName:21a3f8ac-083c-407f-b480-6676a3c5c069 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:39.404852084 +0000 UTC m=+174.213105214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-tls") pod "node-exporter-fxpxv" (UID: "21a3f8ac-083c-407f-b480-6676a3c5c069") : secret "node-exporter-tls" not found
Apr 16 22:16:39.410466 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:39.410421 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-tls\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv"
Apr 16 22:16:39.412673 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:39.412650 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21a3f8ac-083c-407f-b480-6676a3c5c069-node-exporter-tls\") pod \"node-exporter-fxpxv\" (UID: \"21a3f8ac-083c-407f-b480-6676a3c5c069\") " pod="openshift-monitoring/node-exporter-fxpxv"
Apr 16 22:16:39.546228 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:39.546194 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fxpxv"
Apr 16 22:16:39.556289 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:16:39.556259 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21a3f8ac_083c_407f_b480_6676a3c5c069.slice/crio-22e7ee2fe4c00d9b53b6d76a2d9b6e456195de483dd5c5eca66f1db5925ff6dd WatchSource:0}: Error finding container 22e7ee2fe4c00d9b53b6d76a2d9b6e456195de483dd5c5eca66f1db5925ff6dd: Status 404 returned error can't find the container with id 22e7ee2fe4c00d9b53b6d76a2d9b6e456195de483dd5c5eca66f1db5925ff6dd
Apr 16 22:16:40.312343 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:40.312312 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fxpxv" event={"ID":"21a3f8ac-083c-407f-b480-6676a3c5c069","Type":"ContainerStarted","Data":"e9b288f09cb190727153b8a45680df0d8be0c659b3927019ae7ad3fa7a275087"}
Apr 16 22:16:40.312472 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:40.312351 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fxpxv" event={"ID":"21a3f8ac-083c-407f-b480-6676a3c5c069","Type":"ContainerStarted","Data":"22e7ee2fe4c00d9b53b6d76a2d9b6e456195de483dd5c5eca66f1db5925ff6dd"}
Apr 16 22:16:41.316166 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:41.316133 2560 generic.go:358] "Generic (PLEG): container finished" podID="21a3f8ac-083c-407f-b480-6676a3c5c069" containerID="e9b288f09cb190727153b8a45680df0d8be0c659b3927019ae7ad3fa7a275087" exitCode=0
Apr 16 22:16:41.316527 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:41.316192 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fxpxv" event={"ID":"21a3f8ac-083c-407f-b480-6676a3c5c069","Type":"ContainerDied","Data":"e9b288f09cb190727153b8a45680df0d8be0c659b3927019ae7ad3fa7a275087"}
Apr 16 22:16:42.319992 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.319949 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fxpxv" event={"ID":"21a3f8ac-083c-407f-b480-6676a3c5c069","Type":"ContainerStarted","Data":"b758a81fc1c0d1576e58ad802dff184fbe1eafaac1d8196600f5f384d709add1"}
Apr 16 22:16:42.319992 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.319997 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fxpxv" event={"ID":"21a3f8ac-083c-407f-b480-6676a3c5c069","Type":"ContainerStarted","Data":"2ce2242f1325139ebb8fdffc72be1a0d53b9cdf379dae7a4f61c3facfaf6d3d3"}
Apr 16 22:16:42.350608 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.350555 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-fxpxv" podStartSLOduration=4.669315383 podStartE2EDuration="5.350540395s" podCreationTimestamp="2026-04-16 22:16:37 +0000 UTC" firstStartedPulling="2026-04-16 22:16:39.558117924 +0000 UTC m=+174.366371064" lastFinishedPulling="2026-04-16 22:16:40.239342936 +0000 UTC m=+175.047596076" observedRunningTime="2026-04-16 22:16:42.348886509 +0000 UTC m=+177.157139652" watchObservedRunningTime="2026-04-16 22:16:42.350540395 +0000 UTC m=+177.158793546"
Apr 16 22:16:42.908223 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.908189 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"]
Apr 16 22:16:42.911300 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.911284 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:42.914432 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.914397 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 22:16:42.914563 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.914475 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 22:16:42.914563 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.914504 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 22:16:42.914563 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.914523 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 22:16:42.914563 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.914510 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-9wr96\""
Apr 16 22:16:42.914773 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.914739 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 22:16:42.919975 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.919956 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 22:16:42.923205 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.923186 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"]
Apr 16 22:16:42.935612 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.935590 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4psrv\" (UniqueName: \"kubernetes.io/projected/f22f2730-da51-48ae-9486-69fd5fa3e87e-kube-api-access-4psrv\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:42.935718 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.935652 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22f2730-da51-48ae-9486-69fd5fa3e87e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:42.935718 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.935679 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-secret-telemeter-client\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:42.935718 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.935697 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-telemeter-client-tls\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:42.935864 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.935721 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22f2730-da51-48ae-9486-69fd5fa3e87e-serving-certs-ca-bundle\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:42.935864 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.935745 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f22f2730-da51-48ae-9486-69fd5fa3e87e-metrics-client-ca\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:42.935984 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.935884 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:42.935984 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:42.935932 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-federate-client-tls\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.037132 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.037102 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4psrv\" (UniqueName: \"kubernetes.io/projected/f22f2730-da51-48ae-9486-69fd5fa3e87e-kube-api-access-4psrv\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.037308 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.037155 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22f2730-da51-48ae-9486-69fd5fa3e87e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.037308 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.037177 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-secret-telemeter-client\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.037308 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.037194 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-telemeter-client-tls\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.037308 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.037215 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22f2730-da51-48ae-9486-69fd5fa3e87e-serving-certs-ca-bundle\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.037308 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.037230 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f22f2730-da51-48ae-9486-69fd5fa3e87e-metrics-client-ca\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.037308 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.037254 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.037308 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.037276 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-federate-client-tls\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.038037 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.038013 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f22f2730-da51-48ae-9486-69fd5fa3e87e-metrics-client-ca\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.038158 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.038123 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22f2730-da51-48ae-9486-69fd5fa3e87e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.038222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.038127 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22f2730-da51-48ae-9486-69fd5fa3e87e-serving-certs-ca-bundle\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.039768 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.039744 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-federate-client-tls\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.040072 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.040050 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-telemeter-client-tls\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.040124 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.040101 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.040172 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.040158 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f22f2730-da51-48ae-9486-69fd5fa3e87e-secret-telemeter-client\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.056156 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.056133 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4psrv\" (UniqueName: \"kubernetes.io/projected/f22f2730-da51-48ae-9486-69fd5fa3e87e-kube-api-access-4psrv\") pod \"telemeter-client-5dfcfc8856-ksrx4\" (UID: \"f22f2730-da51-48ae-9486-69fd5fa3e87e\") " pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.222201 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.222120 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"
Apr 16 22:16:43.345105 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:43.345041 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4"]
Apr 16 22:16:43.348026 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:16:43.347997 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf22f2730_da51_48ae_9486_69fd5fa3e87e.slice/crio-361f2eee570cb81d183c2262d98e29f0428460289a25c3f81400a5db13759f19 WatchSource:0}: Error finding container 361f2eee570cb81d183c2262d98e29f0428460289a25c3f81400a5db13759f19: Status 404 returned error can't find the container with id 361f2eee570cb81d183c2262d98e29f0428460289a25c3f81400a5db13759f19
Apr 16 22:16:44.263906 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:44.263851 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d6966496b-phxbk"
Apr 16 22:16:44.331452 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:44.331411 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4" event={"ID":"f22f2730-da51-48ae-9486-69fd5fa3e87e","Type":"ContainerStarted","Data":"361f2eee570cb81d183c2262d98e29f0428460289a25c3f81400a5db13759f19"}
Apr 16 22:16:45.335226 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:45.335196 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4" event={"ID":"f22f2730-da51-48ae-9486-69fd5fa3e87e","Type":"ContainerStarted","Data":"bdac07cc626381c06f068dec07d4ffe7966f63e2c15bf7e01a13736782718a1a"}
Apr 16 22:16:46.340133 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:46.340091 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4" event={"ID":"f22f2730-da51-48ae-9486-69fd5fa3e87e","Type":"ContainerStarted","Data":"a5f3bea63d399c584ac95ca90843e577dc3a954b2b28740e380d15b393e6db3d"}
Apr 16 22:16:46.340133 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:46.340134 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4" event={"ID":"f22f2730-da51-48ae-9486-69fd5fa3e87e","Type":"ContainerStarted","Data":"26e8a9fe6916902787ef158f1513497f54c6ac777837f1470da2d851c9bd1b94"}
Apr 16 22:16:46.378262 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:46.378211 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5dfcfc8856-ksrx4" podStartSLOduration=1.810894748 podStartE2EDuration="4.37819847s" podCreationTimestamp="2026-04-16 22:16:42 +0000 UTC" firstStartedPulling="2026-04-16 22:16:43.349766857 +0000 UTC m=+178.158019987" lastFinishedPulling="2026-04-16 22:16:45.917070579 +0000 UTC m=+180.725323709" observedRunningTime="2026-04-16 22:16:46.375918203 +0000 UTC m=+181.184171355" watchObservedRunningTime="2026-04-16 22:16:46.37819847 +0000 UTC m=+181.186451622"
Apr 16 22:16:48.224378 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.224345 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58468845bd-fv8dm"]
Apr 16 22:16:48.227461 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.227443 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.236862 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.236843 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 22:16:48.236973 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.236930 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 22:16:48.237778 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.237759 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 22:16:48.237908 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.237757 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 22:16:48.237981 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.237968 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 22:16:48.238047 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.238028 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 22:16:48.238103 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.238056 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7vb78\""
Apr 16 22:16:48.238151 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.238124 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 22:16:48.251432 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.251413 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 22:16:48.257982 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.257962 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58468845bd-fv8dm"]
Apr 16 22:16:48.277555 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.277529 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-trusted-ca-bundle\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.277675 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.277584 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-serving-cert\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.277730 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.277700 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-oauth-config\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.277779 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.277752 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-config\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.277828 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.277783 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-service-ca\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.277828 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.277819 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97lf5\" (UniqueName: \"kubernetes.io/projected/caad7f66-a0e5-4ee5-b065-c53bdf466613-kube-api-access-97lf5\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.277941 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.277851 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-oauth-serving-cert\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.378772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.378743 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-config\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.378772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.378774 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-service-ca\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.378772 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.378806 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97lf5\" (UniqueName: \"kubernetes.io/projected/caad7f66-a0e5-4ee5-b065-c53bdf466613-kube-api-access-97lf5\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.379102 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.378916 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-oauth-serving-cert\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.379102 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.378991 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-trusted-ca-bundle\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.379102 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.379023 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-serving-cert\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.379102 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.379064 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-oauth-config\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.379490 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.379468 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-service-ca\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.379621 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.379598 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-config\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.379748 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.379728 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-oauth-serving-cert\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.380377 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.380357 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-trusted-ca-bundle\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.381988 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.381962 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-oauth-config\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.382065 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.382048 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-serving-cert\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.388966 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.388946 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97lf5\" (UniqueName: \"kubernetes.io/projected/caad7f66-a0e5-4ee5-b065-c53bdf466613-kube-api-access-97lf5\") pod \"console-58468845bd-fv8dm\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.535909 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.535790 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:48.653944 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:48.653840 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58468845bd-fv8dm"]
Apr 16 22:16:48.656462 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:16:48.656434 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaad7f66_a0e5_4ee5_b065_c53bdf466613.slice/crio-9f74d8e5d17186eb0a0a8588bd78d080b44d3bf54aded3774cb972ec8f1a6400 WatchSource:0}: Error finding container 9f74d8e5d17186eb0a0a8588bd78d080b44d3bf54aded3774cb972ec8f1a6400: Status 404 returned error can't find the container with id 9f74d8e5d17186eb0a0a8588bd78d080b44d3bf54aded3774cb972ec8f1a6400
Apr 16 22:16:49.351770 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:49.351724 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58468845bd-fv8dm" event={"ID":"caad7f66-a0e5-4ee5-b065-c53bdf466613","Type":"ContainerStarted","Data":"9f74d8e5d17186eb0a0a8588bd78d080b44d3bf54aded3774cb972ec8f1a6400"}
Apr 16 22:16:51.358177 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:51.358137 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58468845bd-fv8dm" event={"ID":"caad7f66-a0e5-4ee5-b065-c53bdf466613","Type":"ContainerStarted","Data":"0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0"}
Apr 16 22:16:51.382509 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:51.382461 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58468845bd-fv8dm" podStartSLOduration=0.905942971 podStartE2EDuration="3.38244732s" podCreationTimestamp="2026-04-16 22:16:48 +0000 UTC" firstStartedPulling="2026-04-16 22:16:48.658319115 +0000 UTC m=+183.466572244" lastFinishedPulling="2026-04-16 22:16:51.13482345 +0000 UTC m=+185.943076593" observedRunningTime="2026-04-16 22:16:51.381245128 +0000 UTC m=+186.189498294" watchObservedRunningTime="2026-04-16 22:16:51.38244732 +0000 UTC m=+186.190700472"
Apr 16 22:16:58.536229 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:58.536173 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:58.536738 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:58.536356 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:58.541089 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:58.541066 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:16:59.383736 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:16:59.383711 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58468845bd-fv8dm"
Apr 16 22:17:18.328223 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:18.328162 2560 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" podUID="8098b888-0ffc-4df4-85d4-8ede78080c9a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 22:17:28.328668 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:28.328628 2560 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" podUID="8098b888-0ffc-4df4-85d4-8ede78080c9a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 22:17:38.329003 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:38.328966 2560 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" podUID="8098b888-0ffc-4df4-85d4-8ede78080c9a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 22:17:38.329454 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:38.329040 2560 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc"
Apr 16 22:17:38.329510 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:38.329479 2560 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"2c26cf86cf91407256880227404e02a88df62ffd7c148d57955569b9109ed929"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 22:17:38.329547 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:38.329529 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" podUID="8098b888-0ffc-4df4-85d4-8ede78080c9a" containerName="service-proxy" containerID="cri-o://2c26cf86cf91407256880227404e02a88df62ffd7c148d57955569b9109ed929" gracePeriod=30
Apr 16 22:17:38.488409 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:38.488383 2560 generic.go:358] "Generic (PLEG): container finished" podID="8098b888-0ffc-4df4-85d4-8ede78080c9a" containerID="2c26cf86cf91407256880227404e02a88df62ffd7c148d57955569b9109ed929" exitCode=2
Apr 16 22:17:38.488513 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:38.488423 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" event={"ID":"8098b888-0ffc-4df4-85d4-8ede78080c9a","Type":"ContainerDied","Data":"2c26cf86cf91407256880227404e02a88df62ffd7c148d57955569b9109ed929"}
Apr 16 22:17:39.495397 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:39.495362 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-684689bfb9-lp2jc" event={"ID":"8098b888-0ffc-4df4-85d4-8ede78080c9a","Type":"ContainerStarted","Data":"3645fb1cba30fd364341ee5d958e959ea86de3186843234657aa6df00f816e09"} Apr 16 22:17:57.517994 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:57.517947 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:17:57.520329 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:57.520303 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f45f69f8-87a8-49f9-bb8b-485368427802-metrics-certs\") pod \"network-metrics-daemon-cph62\" (UID: \"f45f69f8-87a8-49f9-bb8b-485368427802\") " pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:17:57.776017 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:57.775938 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dlkb\"" Apr 16 22:17:57.784167 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:57.784149 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cph62" Apr 16 22:17:57.901322 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:57.901291 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cph62"] Apr 16 22:17:57.904363 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:17:57.904337 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf45f69f8_87a8_49f9_bb8b_485368427802.slice/crio-2adb7fd9ccbf1154df565d48841617ce2dfc252539e76952f96d3d1989044544 WatchSource:0}: Error finding container 2adb7fd9ccbf1154df565d48841617ce2dfc252539e76952f96d3d1989044544: Status 404 returned error can't find the container with id 2adb7fd9ccbf1154df565d48841617ce2dfc252539e76952f96d3d1989044544 Apr 16 22:17:58.547553 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:58.547467 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cph62" event={"ID":"f45f69f8-87a8-49f9-bb8b-485368427802","Type":"ContainerStarted","Data":"2adb7fd9ccbf1154df565d48841617ce2dfc252539e76952f96d3d1989044544"} Apr 16 22:17:59.552200 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:59.552166 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cph62" event={"ID":"f45f69f8-87a8-49f9-bb8b-485368427802","Type":"ContainerStarted","Data":"e6c72ab5f13ee3bb29256869c473dc2168531f632eb5ec874c18e0bf3cea925a"} Apr 16 22:17:59.552200 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:59.552202 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cph62" event={"ID":"f45f69f8-87a8-49f9-bb8b-485368427802","Type":"ContainerStarted","Data":"4a8162ee8b7eeb8b64ec75cbb103529111e3d5d63f9bd4f23da154f3b72c61f7"} Apr 16 22:17:59.569837 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:17:59.569751 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-cph62" podStartSLOduration=253.710642579 podStartE2EDuration="4m14.569734164s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:17:57.906197682 +0000 UTC m=+252.714450818" lastFinishedPulling="2026-04-16 22:17:58.765289273 +0000 UTC m=+253.573542403" observedRunningTime="2026-04-16 22:17:59.569585737 +0000 UTC m=+254.377838889" watchObservedRunningTime="2026-04-16 22:17:59.569734164 +0000 UTC m=+254.377987317" Apr 16 22:18:23.660982 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:23.660948 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58468845bd-fv8dm"] Apr 16 22:18:45.655005 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:45.654979 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log" Apr 16 22:18:45.657307 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:45.657285 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log" Apr 16 22:18:45.661478 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:45.661460 2560 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 22:18:48.680241 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:48.680203 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58468845bd-fv8dm" podUID="caad7f66-a0e5-4ee5-b065-c53bdf466613" containerName="console" containerID="cri-o://0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0" gracePeriod=15 Apr 16 22:18:48.914167 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:48.914147 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58468845bd-fv8dm_caad7f66-a0e5-4ee5-b065-c53bdf466613/console/0.log" Apr 16 22:18:48.914274 ip-10-0-142-35 kubenswrapper[2560]: 
I0416 22:18:48.914217 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58468845bd-fv8dm" Apr 16 22:18:49.004193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004116 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-serving-cert\") pod \"caad7f66-a0e5-4ee5-b065-c53bdf466613\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " Apr 16 22:18:49.004193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004146 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-service-ca\") pod \"caad7f66-a0e5-4ee5-b065-c53bdf466613\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " Apr 16 22:18:49.004193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004166 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-oauth-serving-cert\") pod \"caad7f66-a0e5-4ee5-b065-c53bdf466613\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " Apr 16 22:18:49.004193 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004198 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-oauth-config\") pod \"caad7f66-a0e5-4ee5-b065-c53bdf466613\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " Apr 16 22:18:49.004515 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004226 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-trusted-ca-bundle\") pod \"caad7f66-a0e5-4ee5-b065-c53bdf466613\" (UID: 
\"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " Apr 16 22:18:49.004515 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004274 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-config\") pod \"caad7f66-a0e5-4ee5-b065-c53bdf466613\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " Apr 16 22:18:49.004515 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004304 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97lf5\" (UniqueName: \"kubernetes.io/projected/caad7f66-a0e5-4ee5-b065-c53bdf466613-kube-api-access-97lf5\") pod \"caad7f66-a0e5-4ee5-b065-c53bdf466613\" (UID: \"caad7f66-a0e5-4ee5-b065-c53bdf466613\") " Apr 16 22:18:49.004675 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004640 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-service-ca" (OuterVolumeSpecName: "service-ca") pod "caad7f66-a0e5-4ee5-b065-c53bdf466613" (UID: "caad7f66-a0e5-4ee5-b065-c53bdf466613"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:49.004731 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004659 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "caad7f66-a0e5-4ee5-b065-c53bdf466613" (UID: "caad7f66-a0e5-4ee5-b065-c53bdf466613"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:49.004731 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004667 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-config" (OuterVolumeSpecName: "console-config") pod "caad7f66-a0e5-4ee5-b065-c53bdf466613" (UID: "caad7f66-a0e5-4ee5-b065-c53bdf466613"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:49.004731 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.004659 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "caad7f66-a0e5-4ee5-b065-c53bdf466613" (UID: "caad7f66-a0e5-4ee5-b065-c53bdf466613"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:49.006413 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.006376 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "caad7f66-a0e5-4ee5-b065-c53bdf466613" (UID: "caad7f66-a0e5-4ee5-b065-c53bdf466613"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:49.006521 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.006448 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "caad7f66-a0e5-4ee5-b065-c53bdf466613" (UID: "caad7f66-a0e5-4ee5-b065-c53bdf466613"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:49.006521 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.006465 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caad7f66-a0e5-4ee5-b065-c53bdf466613-kube-api-access-97lf5" (OuterVolumeSpecName: "kube-api-access-97lf5") pod "caad7f66-a0e5-4ee5-b065-c53bdf466613" (UID: "caad7f66-a0e5-4ee5-b065-c53bdf466613"). InnerVolumeSpecName "kube-api-access-97lf5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:18:49.104987 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.104952 2560 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-serving-cert\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:18:49.104987 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.104979 2560 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-service-ca\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:18:49.104987 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.104989 2560 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-oauth-serving-cert\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:18:49.105187 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.104998 2560 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-oauth-config\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:18:49.105187 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.105007 2560 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-trusted-ca-bundle\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:18:49.105187 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.105015 2560 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/caad7f66-a0e5-4ee5-b065-c53bdf466613-console-config\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:18:49.105187 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.105025 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-97lf5\" (UniqueName: \"kubernetes.io/projected/caad7f66-a0e5-4ee5-b065-c53bdf466613-kube-api-access-97lf5\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:18:49.684194 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.684166 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58468845bd-fv8dm_caad7f66-a0e5-4ee5-b065-c53bdf466613/console/0.log" Apr 16 22:18:49.684666 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.684207 2560 generic.go:358] "Generic (PLEG): container finished" podID="caad7f66-a0e5-4ee5-b065-c53bdf466613" containerID="0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0" exitCode=2 Apr 16 22:18:49.684666 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.684277 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58468845bd-fv8dm" event={"ID":"caad7f66-a0e5-4ee5-b065-c53bdf466613","Type":"ContainerDied","Data":"0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0"} Apr 16 22:18:49.684666 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.684293 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58468845bd-fv8dm" Apr 16 22:18:49.684666 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.684313 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58468845bd-fv8dm" event={"ID":"caad7f66-a0e5-4ee5-b065-c53bdf466613","Type":"ContainerDied","Data":"9f74d8e5d17186eb0a0a8588bd78d080b44d3bf54aded3774cb972ec8f1a6400"} Apr 16 22:18:49.684666 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.684329 2560 scope.go:117] "RemoveContainer" containerID="0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0" Apr 16 22:18:49.692599 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.692574 2560 scope.go:117] "RemoveContainer" containerID="0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0" Apr 16 22:18:49.692841 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:18:49.692821 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0\": container with ID starting with 0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0 not found: ID does not exist" containerID="0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0" Apr 16 22:18:49.692928 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.692854 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0"} err="failed to get container status \"0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0\": rpc error: code = NotFound desc = could not find container \"0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0\": container with ID starting with 0153e43f8d324e892054f03dcdc42a92ac1e7529be012f271f4b8515ae252aa0 not found: ID does not exist" Apr 16 22:18:49.703799 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.703770 2560 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-58468845bd-fv8dm"] Apr 16 22:18:49.707920 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.707899 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58468845bd-fv8dm"] Apr 16 22:18:49.777350 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:18:49.777316 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caad7f66-a0e5-4ee5-b065-c53bdf466613" path="/var/lib/kubelet/pods/caad7f66-a0e5-4ee5-b065-c53bdf466613/volumes" Apr 16 22:20:29.101134 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.101099 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-42hbz"] Apr 16 22:20:29.101528 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.101353 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="caad7f66-a0e5-4ee5-b065-c53bdf466613" containerName="console" Apr 16 22:20:29.101528 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.101364 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="caad7f66-a0e5-4ee5-b065-c53bdf466613" containerName="console" Apr 16 22:20:29.101528 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.101411 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="caad7f66-a0e5-4ee5-b065-c53bdf466613" containerName="console" Apr 16 22:20:29.104095 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.104079 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" Apr 16 22:20:29.106725 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.106695 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 22:20:29.106852 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.106807 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 22:20:29.107667 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.107650 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-rvlzw\"" Apr 16 22:20:29.107758 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.107656 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 22:20:29.116650 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.116624 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-42hbz"] Apr 16 22:20:29.284011 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.283977 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fa23ecb-011f-4af6-ae90-66bd39889f28-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-42hbz\" (UID: \"3fa23ecb-011f-4af6-ae90-66bd39889f28\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" Apr 16 22:20:29.284190 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.284049 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdwlm\" (UniqueName: \"kubernetes.io/projected/3fa23ecb-011f-4af6-ae90-66bd39889f28-kube-api-access-jdwlm\") pod \"llmisvc-controller-manager-68cc5db7c4-42hbz\" (UID: \"3fa23ecb-011f-4af6-ae90-66bd39889f28\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" Apr 16 
22:20:29.384424 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.384330 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwlm\" (UniqueName: \"kubernetes.io/projected/3fa23ecb-011f-4af6-ae90-66bd39889f28-kube-api-access-jdwlm\") pod \"llmisvc-controller-manager-68cc5db7c4-42hbz\" (UID: \"3fa23ecb-011f-4af6-ae90-66bd39889f28\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" Apr 16 22:20:29.384424 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.384384 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fa23ecb-011f-4af6-ae90-66bd39889f28-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-42hbz\" (UID: \"3fa23ecb-011f-4af6-ae90-66bd39889f28\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" Apr 16 22:20:29.386768 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.386742 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3fa23ecb-011f-4af6-ae90-66bd39889f28-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-42hbz\" (UID: \"3fa23ecb-011f-4af6-ae90-66bd39889f28\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" Apr 16 22:20:29.394079 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.394056 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwlm\" (UniqueName: \"kubernetes.io/projected/3fa23ecb-011f-4af6-ae90-66bd39889f28-kube-api-access-jdwlm\") pod \"llmisvc-controller-manager-68cc5db7c4-42hbz\" (UID: \"3fa23ecb-011f-4af6-ae90-66bd39889f28\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" Apr 16 22:20:29.412983 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.412959 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" Apr 16 22:20:29.546095 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.546039 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-42hbz"] Apr 16 22:20:29.549602 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:20:29.549561 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3fa23ecb_011f_4af6_ae90_66bd39889f28.slice/crio-917adaf0d5bfa7442a8ba04f6eeeae395c71c2db1666978e1977e22daf1f236b WatchSource:0}: Error finding container 917adaf0d5bfa7442a8ba04f6eeeae395c71c2db1666978e1977e22daf1f236b: Status 404 returned error can't find the container with id 917adaf0d5bfa7442a8ba04f6eeeae395c71c2db1666978e1977e22daf1f236b Apr 16 22:20:29.550844 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.550829 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:20:29.943448 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:29.943414 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" event={"ID":"3fa23ecb-011f-4af6-ae90-66bd39889f28","Type":"ContainerStarted","Data":"917adaf0d5bfa7442a8ba04f6eeeae395c71c2db1666978e1977e22daf1f236b"} Apr 16 22:20:31.949891 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:31.949847 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" event={"ID":"3fa23ecb-011f-4af6-ae90-66bd39889f28","Type":"ContainerStarted","Data":"532ffd70e6b23dc59a5c341240debec1b27a184ac3255fb59a839899200a76b3"} Apr 16 22:20:31.969118 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:31.969071 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" podStartSLOduration=1.14396707 podStartE2EDuration="2.969057431s" podCreationTimestamp="2026-04-16 22:20:29 +0000 UTC" 
firstStartedPulling="2026-04-16 22:20:29.5509684 +0000 UTC m=+404.359221530" lastFinishedPulling="2026-04-16 22:20:31.376058757 +0000 UTC m=+406.184311891" observedRunningTime="2026-04-16 22:20:31.96846331 +0000 UTC m=+406.776716463" watchObservedRunningTime="2026-04-16 22:20:31.969057431 +0000 UTC m=+406.777310583" Apr 16 22:20:32.952261 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:20:32.952224 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" Apr 16 22:21:03.957724 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:03.957649 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-42hbz" Apr 16 22:21:57.971961 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.971924 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dbdc49d55-ttn49"] Apr 16 22:21:57.974900 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.974864 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:57.983149 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.983129 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 22:21:57.983243 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.983134 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 22:21:57.983243 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.983171 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 22:21:57.984223 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.984205 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 22:21:57.984314 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.984217 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 22:21:57.984314 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.984297 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 22:21:57.985323 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.985308 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7vb78\""
Apr 16 22:21:57.986648 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.986555 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 22:21:57.988894 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.988849 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dbdc49d55-ttn49"]
Apr 16 22:21:57.989047 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:57.988894 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 22:21:58.107432 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.107401 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-console-config\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.107432 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.107436 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-service-ca\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.107627 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.107467 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-trusted-ca-bundle\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.107627 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.107517 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-oauth-serving-cert\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.107627 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.107564 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c04f52de-8bf1-49c0-b49d-5142d84557e0-console-serving-cert\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.107627 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.107581 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5n62\" (UniqueName: \"kubernetes.io/projected/c04f52de-8bf1-49c0-b49d-5142d84557e0-kube-api-access-z5n62\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.107627 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.107614 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c04f52de-8bf1-49c0-b49d-5142d84557e0-console-oauth-config\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.208527 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.208492 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-trusted-ca-bundle\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.208527 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.208526 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-oauth-serving-cert\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.208773 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.208552 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c04f52de-8bf1-49c0-b49d-5142d84557e0-console-serving-cert\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.208773 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.208568 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5n62\" (UniqueName: \"kubernetes.io/projected/c04f52de-8bf1-49c0-b49d-5142d84557e0-kube-api-access-z5n62\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.208773 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.208592 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c04f52de-8bf1-49c0-b49d-5142d84557e0-console-oauth-config\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.208773 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.208626 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-console-config\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.208773 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.208641 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-service-ca\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.209480 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.209449 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-oauth-serving-cert\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.209480 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.209471 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-service-ca\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.209739 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.209514 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-console-config\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.209739 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.209709 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c04f52de-8bf1-49c0-b49d-5142d84557e0-trusted-ca-bundle\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.211021 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.211001 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c04f52de-8bf1-49c0-b49d-5142d84557e0-console-oauth-config\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.211115 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.211103 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c04f52de-8bf1-49c0-b49d-5142d84557e0-console-serving-cert\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.217282 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.217261 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5n62\" (UniqueName: \"kubernetes.io/projected/c04f52de-8bf1-49c0-b49d-5142d84557e0-kube-api-access-z5n62\") pod \"console-6dbdc49d55-ttn49\" (UID: \"c04f52de-8bf1-49c0-b49d-5142d84557e0\") " pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.283823 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.283752 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:21:58.397228 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:58.397199 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dbdc49d55-ttn49"]
Apr 16 22:21:58.400106 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:21:58.400077 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc04f52de_8bf1_49c0_b49d_5142d84557e0.slice/crio-bcfa6bc203eec2505c5a56460a4fc8197e95e1039a221cd9f461cca261e43e19 WatchSource:0}: Error finding container bcfa6bc203eec2505c5a56460a4fc8197e95e1039a221cd9f461cca261e43e19: Status 404 returned error can't find the container with id bcfa6bc203eec2505c5a56460a4fc8197e95e1039a221cd9f461cca261e43e19
Apr 16 22:21:59.178052 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:59.178013 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbdc49d55-ttn49" event={"ID":"c04f52de-8bf1-49c0-b49d-5142d84557e0","Type":"ContainerStarted","Data":"b11e09dc89a87d86bca68e1b55081e9e957fc49c8045ab666cd2e020a663ce49"}
Apr 16 22:21:59.178052 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:59.178049 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbdc49d55-ttn49" event={"ID":"c04f52de-8bf1-49c0-b49d-5142d84557e0","Type":"ContainerStarted","Data":"bcfa6bc203eec2505c5a56460a4fc8197e95e1039a221cd9f461cca261e43e19"}
Apr 16 22:21:59.199139 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:21:59.197232 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dbdc49d55-ttn49" podStartSLOduration=2.197214605 podStartE2EDuration="2.197214605s" podCreationTimestamp="2026-04-16 22:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:21:59.195500951 +0000 UTC m=+494.003754103" watchObservedRunningTime="2026-04-16 22:21:59.197214605 +0000 UTC m=+494.005467758"
Apr 16 22:22:08.284049 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:22:08.284020 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:22:08.284484 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:22:08.284092 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:22:08.288442 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:22:08.288416 2560 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:22:09.209222 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:22:09.209196 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dbdc49d55-ttn49"
Apr 16 22:23:45.673969 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:23:45.673940 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log"
Apr 16 22:23:45.675065 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:23:45.675040 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log"
Apr 16 22:23:59.919392 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:23:59.919300 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"]
Apr 16 22:23:59.922616 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:23:59.922592 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:23:59.925610 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:23:59.925587 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 22:23:59.925730 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:23:59.925643 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gj7r7\""
Apr 16 22:23:59.925730 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:23:59.925680 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 22:23:59.925730 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:23:59.925592 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\""
Apr 16 22:23:59.926656 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:23:59.926642 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-b6a6b-predictor-serving-cert\""
Apr 16 22:23:59.936253 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:23:59.936233 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"]
Apr 16 22:24:00.105310 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.105279 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25d27c76-9ac6-45db-addc-c60d9e9f3109-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.105310 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.105319 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dth4\" (UniqueName: \"kubernetes.io/projected/25d27c76-9ac6-45db-addc-c60d9e9f3109-kube-api-access-4dth4\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.105527 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.105406 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25d27c76-9ac6-45db-addc-c60d9e9f3109-isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.105527 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.105452 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d27c76-9ac6-45db-addc-c60d9e9f3109-proxy-tls\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.206703 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.206624 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dth4\" (UniqueName: \"kubernetes.io/projected/25d27c76-9ac6-45db-addc-c60d9e9f3109-kube-api-access-4dth4\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.206703 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.206679 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25d27c76-9ac6-45db-addc-c60d9e9f3109-isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.206860 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.206710 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d27c76-9ac6-45db-addc-c60d9e9f3109-proxy-tls\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.206860 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.206742 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25d27c76-9ac6-45db-addc-c60d9e9f3109-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.207232 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.207211 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25d27c76-9ac6-45db-addc-c60d9e9f3109-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.207389 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.207368 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25d27c76-9ac6-45db-addc-c60d9e9f3109-isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.209231 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.209214 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d27c76-9ac6-45db-addc-c60d9e9f3109-proxy-tls\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.215815 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.215794 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dth4\" (UniqueName: \"kubernetes.io/projected/25d27c76-9ac6-45db-addc-c60d9e9f3109-kube-api-access-4dth4\") pod \"isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.232351 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.232328 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:00.355758 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.355724 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"]
Apr 16 22:24:00.359025 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:24:00.358998 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25d27c76_9ac6_45db_addc_c60d9e9f3109.slice/crio-f12a8db4c0d962a95b3bee3c6c0329818960d79f3ffd9e49b3013fe3eba6d69c WatchSource:0}: Error finding container f12a8db4c0d962a95b3bee3c6c0329818960d79f3ffd9e49b3013fe3eba6d69c: Status 404 returned error can't find the container with id f12a8db4c0d962a95b3bee3c6c0329818960d79f3ffd9e49b3013fe3eba6d69c
Apr 16 22:24:00.504277 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:00.504185 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" event={"ID":"25d27c76-9ac6-45db-addc-c60d9e9f3109","Type":"ContainerStarted","Data":"f12a8db4c0d962a95b3bee3c6c0329818960d79f3ffd9e49b3013fe3eba6d69c"}
Apr 16 22:24:04.519758 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:04.519719 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" event={"ID":"25d27c76-9ac6-45db-addc-c60d9e9f3109","Type":"ContainerStarted","Data":"c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e"}
Apr 16 22:24:07.528042 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:07.528006 2560 generic.go:358] "Generic (PLEG): container finished" podID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerID="c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e" exitCode=0
Apr 16 22:24:07.528403 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:07.528051 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" event={"ID":"25d27c76-9ac6-45db-addc-c60d9e9f3109","Type":"ContainerDied","Data":"c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e"}
Apr 16 22:24:24.577506 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:24.577472 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" event={"ID":"25d27c76-9ac6-45db-addc-c60d9e9f3109","Type":"ContainerStarted","Data":"96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff"}
Apr 16 22:24:26.585884 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:26.585835 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" event={"ID":"25d27c76-9ac6-45db-addc-c60d9e9f3109","Type":"ContainerStarted","Data":"4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522"}
Apr 16 22:24:26.586240 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:26.585998 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:26.604642 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:26.604589 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podStartSLOduration=1.488156762 podStartE2EDuration="27.604573873s" podCreationTimestamp="2026-04-16 22:23:59 +0000 UTC" firstStartedPulling="2026-04-16 22:24:00.360719365 +0000 UTC m=+615.168972496" lastFinishedPulling="2026-04-16 22:24:26.477136464 +0000 UTC m=+641.285389607" observedRunningTime="2026-04-16 22:24:26.603416329 +0000 UTC m=+641.411669481" watchObservedRunningTime="2026-04-16 22:24:26.604573873 +0000 UTC m=+641.412827025"
Apr 16 22:24:27.589219 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:27.589189 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:27.590446 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:27.590420 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 16 22:24:28.591556 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:28.591512 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 16 22:24:33.596120 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:33.596088 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:24:33.596596 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:33.596571 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 16 22:24:43.596775 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:43.596727 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 16 22:24:53.596757 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:24:53.596713 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 16 22:25:03.596564 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:03.596524 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 16 22:25:13.597470 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:13.597431 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 16 22:25:23.596536 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:23.596493 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused"
Apr 16 22:25:33.597116 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:33.597044 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"
Apr 16 22:25:50.251092 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.251049 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"]
Apr 16 22:25:50.254414 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.254389 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.256659 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.256638 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-23b0a-predictor-serving-cert\""
Apr 16 22:25:50.256659 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.256647 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\""
Apr 16 22:25:50.262184 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.262160 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"]
Apr 16 22:25:50.331955 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.331920 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"]
Apr 16 22:25:50.332393 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.332332 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" containerID="cri-o://96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff" gracePeriod=30
Apr 16 22:25:50.332496 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.332406 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kube-rbac-proxy" containerID="cri-o://4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522" gracePeriod=30
Apr 16 22:25:50.380472 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.380427 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/067b738d-d34b-418d-8950-9d9a810eccae-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.380646 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.380489 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d8gp\" (UniqueName: \"kubernetes.io/projected/067b738d-d34b-418d-8950-9d9a810eccae-kube-api-access-7d8gp\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.380646 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.380591 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/067b738d-d34b-418d-8950-9d9a810eccae-isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.380646 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.380630 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/067b738d-d34b-418d-8950-9d9a810eccae-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.481090 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.481056 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7d8gp\" (UniqueName: \"kubernetes.io/projected/067b738d-d34b-418d-8950-9d9a810eccae-kube-api-access-7d8gp\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.481271 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.481129 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/067b738d-d34b-418d-8950-9d9a810eccae-isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.481271 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.481153 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/067b738d-d34b-418d-8950-9d9a810eccae-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.481271 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.481177 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/067b738d-d34b-418d-8950-9d9a810eccae-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.481609 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.481582 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/067b738d-d34b-418d-8950-9d9a810eccae-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.481807 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.481786 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/067b738d-d34b-418d-8950-9d9a810eccae-isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.483636 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.483614 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/067b738d-d34b-418d-8950-9d9a810eccae-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.491431 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.491403 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d8gp\" (UniqueName: \"kubernetes.io/projected/067b738d-d34b-418d-8950-9d9a810eccae-kube-api-access-7d8gp\") pod \"isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.564938 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.564842 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"
Apr 16 22:25:50.683679 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.683646 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"]
Apr 16 22:25:50.686517 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:25:50.686485 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod067b738d_d34b_418d_8950_9d9a810eccae.slice/crio-f571114be2bf641829c1f057f7359d120fba3077c44e8f22d0a31f615f09f837 WatchSource:0}: Error finding container f571114be2bf641829c1f057f7359d120fba3077c44e8f22d0a31f615f09f837: Status 404 returned error can't find the container with id f571114be2bf641829c1f057f7359d120fba3077c44e8f22d0a31f615f09f837
Apr 16 22:25:50.688388 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.688369 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:25:50.815598 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.815512 2560 generic.go:358] "Generic (PLEG): container finished" podID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerID="4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522" exitCode=2
Apr 16 22:25:50.815598 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.815585 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" event={"ID":"25d27c76-9ac6-45db-addc-c60d9e9f3109","Type":"ContainerDied","Data":"4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522"}
Apr 16 22:25:50.819458
ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.819424 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" event={"ID":"067b738d-d34b-418d-8950-9d9a810eccae","Type":"ContainerStarted","Data":"f5db8ad71f626f7aa2637fb503d5d86daed314aa6dfd3589d5a46e6bf7a8b475"} Apr 16 22:25:50.819458 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:50.819457 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" event={"ID":"067b738d-d34b-418d-8950-9d9a810eccae","Type":"ContainerStarted","Data":"f571114be2bf641829c1f057f7359d120fba3077c44e8f22d0a31f615f09f837"} Apr 16 22:25:53.592002 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:53.591951 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.16:8643/healthz\": dial tcp 10.133.0.16:8643: connect: connection refused" Apr 16 22:25:53.597386 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:53.597352 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 16 22:25:54.070687 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.070665 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" Apr 16 22:25:54.211844 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.211750 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25d27c76-9ac6-45db-addc-c60d9e9f3109-kserve-provision-location\") pod \"25d27c76-9ac6-45db-addc-c60d9e9f3109\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " Apr 16 22:25:54.211844 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.211840 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d27c76-9ac6-45db-addc-c60d9e9f3109-proxy-tls\") pod \"25d27c76-9ac6-45db-addc-c60d9e9f3109\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " Apr 16 22:25:54.212044 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.211897 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dth4\" (UniqueName: \"kubernetes.io/projected/25d27c76-9ac6-45db-addc-c60d9e9f3109-kube-api-access-4dth4\") pod \"25d27c76-9ac6-45db-addc-c60d9e9f3109\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " Apr 16 22:25:54.212044 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.212029 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25d27c76-9ac6-45db-addc-c60d9e9f3109-isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\") pod \"25d27c76-9ac6-45db-addc-c60d9e9f3109\" (UID: \"25d27c76-9ac6-45db-addc-c60d9e9f3109\") " Apr 16 22:25:54.212158 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.212109 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d27c76-9ac6-45db-addc-c60d9e9f3109-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"25d27c76-9ac6-45db-addc-c60d9e9f3109" (UID: "25d27c76-9ac6-45db-addc-c60d9e9f3109"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:25:54.212319 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.212248 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25d27c76-9ac6-45db-addc-c60d9e9f3109-kserve-provision-location\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:25:54.212424 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.212402 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d27c76-9ac6-45db-addc-c60d9e9f3109-isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config") pod "25d27c76-9ac6-45db-addc-c60d9e9f3109" (UID: "25d27c76-9ac6-45db-addc-c60d9e9f3109"). InnerVolumeSpecName "isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:25:54.214111 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.214087 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d27c76-9ac6-45db-addc-c60d9e9f3109-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "25d27c76-9ac6-45db-addc-c60d9e9f3109" (UID: "25d27c76-9ac6-45db-addc-c60d9e9f3109"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:25:54.214111 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.214091 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d27c76-9ac6-45db-addc-c60d9e9f3109-kube-api-access-4dth4" (OuterVolumeSpecName: "kube-api-access-4dth4") pod "25d27c76-9ac6-45db-addc-c60d9e9f3109" (UID: "25d27c76-9ac6-45db-addc-c60d9e9f3109"). InnerVolumeSpecName "kube-api-access-4dth4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:25:54.313099 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.313060 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d27c76-9ac6-45db-addc-c60d9e9f3109-proxy-tls\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:25:54.313099 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.313092 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4dth4\" (UniqueName: \"kubernetes.io/projected/25d27c76-9ac6-45db-addc-c60d9e9f3109-kube-api-access-4dth4\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:25:54.313099 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.313102 2560 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/25d27c76-9ac6-45db-addc-c60d9e9f3109-isvc-xgboost-graph-raw-b6a6b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:25:54.832577 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.832543 2560 generic.go:358] "Generic (PLEG): container finished" podID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerID="96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff" exitCode=0 Apr 16 22:25:54.833030 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.832616 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" event={"ID":"25d27c76-9ac6-45db-addc-c60d9e9f3109","Type":"ContainerDied","Data":"96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff"} Apr 16 22:25:54.833030 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.832622 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" Apr 16 22:25:54.833030 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.832653 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj" event={"ID":"25d27c76-9ac6-45db-addc-c60d9e9f3109","Type":"ContainerDied","Data":"f12a8db4c0d962a95b3bee3c6c0329818960d79f3ffd9e49b3013fe3eba6d69c"} Apr 16 22:25:54.833030 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.832676 2560 scope.go:117] "RemoveContainer" containerID="4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522" Apr 16 22:25:54.834150 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.834125 2560 generic.go:358] "Generic (PLEG): container finished" podID="067b738d-d34b-418d-8950-9d9a810eccae" containerID="f5db8ad71f626f7aa2637fb503d5d86daed314aa6dfd3589d5a46e6bf7a8b475" exitCode=0 Apr 16 22:25:54.834292 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.834202 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" event={"ID":"067b738d-d34b-418d-8950-9d9a810eccae","Type":"ContainerDied","Data":"f5db8ad71f626f7aa2637fb503d5d86daed314aa6dfd3589d5a46e6bf7a8b475"} Apr 16 22:25:54.840979 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.840832 2560 scope.go:117] "RemoveContainer" containerID="96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff" Apr 16 22:25:54.847967 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.847945 2560 scope.go:117] "RemoveContainer" containerID="c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e" Apr 16 22:25:54.854855 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.854839 2560 scope.go:117] "RemoveContainer" containerID="4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522" Apr 16 22:25:54.855103 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:25:54.855083 2560 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522\": container with ID starting with 4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522 not found: ID does not exist" containerID="4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522" Apr 16 22:25:54.855168 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.855110 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522"} err="failed to get container status \"4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522\": rpc error: code = NotFound desc = could not find container \"4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522\": container with ID starting with 4052c95bcf4c3c4d76b510daa5c12547fe78e271a7e15315192967620466c522 not found: ID does not exist" Apr 16 22:25:54.855168 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.855125 2560 scope.go:117] "RemoveContainer" containerID="96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff" Apr 16 22:25:54.855378 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:25:54.855358 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff\": container with ID starting with 96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff not found: ID does not exist" containerID="96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff" Apr 16 22:25:54.855441 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.855384 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff"} err="failed to get container status 
\"96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff\": rpc error: code = NotFound desc = could not find container \"96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff\": container with ID starting with 96eef13c88eac82ba2a950e494a965c8921a47b3c24822170915c039462661ff not found: ID does not exist" Apr 16 22:25:54.855441 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.855401 2560 scope.go:117] "RemoveContainer" containerID="c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e" Apr 16 22:25:54.855616 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:25:54.855601 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e\": container with ID starting with c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e not found: ID does not exist" containerID="c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e" Apr 16 22:25:54.855667 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.855621 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e"} err="failed to get container status \"c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e\": rpc error: code = NotFound desc = could not find container \"c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e\": container with ID starting with c1dd62e7302cfeb4a15e959615d9a5eafd5655b1b51c60b24779153852ffd26e not found: ID does not exist" Apr 16 22:25:54.897393 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.897372 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"] Apr 16 22:25:54.900857 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:54.900836 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-b6a6b-predictor-66b4f8fc7-446vj"] Apr 16 22:25:55.776704 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:55.776664 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" path="/var/lib/kubelet/pods/25d27c76-9ac6-45db-addc-c60d9e9f3109/volumes" Apr 16 22:25:55.838955 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:55.838922 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" event={"ID":"067b738d-d34b-418d-8950-9d9a810eccae","Type":"ContainerStarted","Data":"c7c426dc091216765a1a4dd918163532ecfccdb2259e0e44f42dabca55c8e00c"} Apr 16 22:25:55.839335 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:55.838960 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" event={"ID":"067b738d-d34b-418d-8950-9d9a810eccae","Type":"ContainerStarted","Data":"597e1d41f1ee51a8b9e9628a6b3f4955764c967c4ca1429a632d4b25f96875c3"} Apr 16 22:25:55.839335 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:55.839174 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" Apr 16 22:25:55.857135 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:55.857087 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podStartSLOduration=5.857072364 podStartE2EDuration="5.857072364s" podCreationTimestamp="2026-04-16 22:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:25:55.856013745 +0000 UTC m=+730.664266897" watchObservedRunningTime="2026-04-16 22:25:55.857072364 +0000 UTC m=+730.665325494" Apr 16 22:25:56.842326 
ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:56.842290 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" Apr 16 22:25:56.843628 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:56.843595 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 16 22:25:57.845795 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:25:57.845750 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 16 22:26:02.850345 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:26:02.850314 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" Apr 16 22:26:02.850817 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:26:02.850791 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 16 22:26:12.850813 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:26:12.850769 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: 
connection refused" Apr 16 22:26:22.851360 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:26:22.851315 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 16 22:26:32.851015 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:26:32.850961 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 16 22:26:42.850795 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:26:42.850754 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 16 22:26:52.851263 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:26:52.851220 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 16 22:27:02.851642 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:02.851561 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" Apr 16 22:27:30.429080 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.429042 2560 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv"] Apr 16 22:27:30.429705 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.429423 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" Apr 16 22:27:30.429705 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.429442 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" Apr 16 22:27:30.429705 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.429465 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="storage-initializer" Apr 16 22:27:30.429705 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.429473 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="storage-initializer" Apr 16 22:27:30.429705 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.429493 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kube-rbac-proxy" Apr 16 22:27:30.429705 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.429502 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kube-rbac-proxy" Apr 16 22:27:30.429705 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.429580 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kserve-container" Apr 16 22:27:30.429705 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.429594 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="25d27c76-9ac6-45db-addc-c60d9e9f3109" containerName="kube-rbac-proxy" Apr 16 22:27:30.432723 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.432702 2560 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:30.435853 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.435827 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-1bd3d-predictor-serving-cert\"" Apr 16 22:27:30.435853 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.435835 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\"" Apr 16 22:27:30.441750 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.441726 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv"] Apr 16 22:27:30.470264 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.470234 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"] Apr 16 22:27:30.470602 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.470559 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" containerID="cri-o://597e1d41f1ee51a8b9e9628a6b3f4955764c967c4ca1429a632d4b25f96875c3" gracePeriod=30 Apr 16 22:27:30.470666 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.470632 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kube-rbac-proxy" containerID="cri-o://c7c426dc091216765a1a4dd918163532ecfccdb2259e0e44f42dabca55c8e00c" gracePeriod=30 Apr 16 22:27:30.554768 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.554727 2560 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a23b60f-2b38-46e1-acc5-6a715cc3966d-message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:30.554958 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.554791 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:30.554958 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.554903 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc75x\" (UniqueName: \"kubernetes.io/projected/9a23b60f-2b38-46e1-acc5-6a715cc3966d-kube-api-access-wc75x\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:30.655653 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.655613 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:30.655818 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.655688 2560 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wc75x\" (UniqueName: \"kubernetes.io/projected/9a23b60f-2b38-46e1-acc5-6a715cc3966d-kube-api-access-wc75x\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:30.655818 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.655752 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a23b60f-2b38-46e1-acc5-6a715cc3966d-message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:30.655818 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:27:30.655775 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-serving-cert: secret "message-dumper-raw-1bd3d-predictor-serving-cert" not found Apr 16 22:27:30.655980 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:27:30.655857 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls podName:9a23b60f-2b38-46e1-acc5-6a715cc3966d nodeName:}" failed. No retries permitted until 2026-04-16 22:27:31.155834338 +0000 UTC m=+825.964087468 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls") pod "message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" (UID: "9a23b60f-2b38-46e1-acc5-6a715cc3966d") : secret "message-dumper-raw-1bd3d-predictor-serving-cert" not found Apr 16 22:27:30.656378 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.656358 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a23b60f-2b38-46e1-acc5-6a715cc3966d-message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:30.665170 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:30.665138 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc75x\" (UniqueName: \"kubernetes.io/projected/9a23b60f-2b38-46e1-acc5-6a715cc3966d-kube-api-access-wc75x\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:31.094679 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:31.094644 2560 generic.go:358] "Generic (PLEG): container finished" podID="067b738d-d34b-418d-8950-9d9a810eccae" containerID="c7c426dc091216765a1a4dd918163532ecfccdb2259e0e44f42dabca55c8e00c" exitCode=2 Apr 16 22:27:31.094679 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:31.094682 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" event={"ID":"067b738d-d34b-418d-8950-9d9a810eccae","Type":"ContainerDied","Data":"c7c426dc091216765a1a4dd918163532ecfccdb2259e0e44f42dabca55c8e00c"} Apr 16 22:27:31.160198 ip-10-0-142-35 
kubenswrapper[2560]: I0416 22:27:31.160159 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:31.160381 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:27:31.160337 2560 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-serving-cert: secret "message-dumper-raw-1bd3d-predictor-serving-cert" not found Apr 16 22:27:31.160445 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:27:31.160424 2560 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls podName:9a23b60f-2b38-46e1-acc5-6a715cc3966d nodeName:}" failed. No retries permitted until 2026-04-16 22:27:32.160398684 +0000 UTC m=+826.968651817 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls") pod "message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" (UID: "9a23b60f-2b38-46e1-acc5-6a715cc3966d") : secret "message-dumper-raw-1bd3d-predictor-serving-cert" not found Apr 16 22:27:32.167568 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:32.167525 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:32.170007 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:32.169984 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls\") pod \"message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:32.242730 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:32.242680 2560 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:32.361156 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:32.361123 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv"] Apr 16 22:27:32.366232 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:27:32.366204 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a23b60f_2b38_46e1_acc5_6a715cc3966d.slice/crio-b88b02fec0fa730291c178a424ee7908b261fcd62cbee172dad9282f73856223 WatchSource:0}: Error finding container b88b02fec0fa730291c178a424ee7908b261fcd62cbee172dad9282f73856223: Status 404 returned error can't find the container with id b88b02fec0fa730291c178a424ee7908b261fcd62cbee172dad9282f73856223 Apr 16 22:27:32.846510 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:32.846473 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.17:8643/healthz\": dial tcp 10.133.0.17:8643: connect: connection refused" Apr 16 22:27:32.850771 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:32.850741 2560 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 16 22:27:33.101700 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:33.101630 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" 
event={"ID":"9a23b60f-2b38-46e1-acc5-6a715cc3966d","Type":"ContainerStarted","Data":"b88b02fec0fa730291c178a424ee7908b261fcd62cbee172dad9282f73856223"} Apr 16 22:27:34.106897 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.106854 2560 generic.go:358] "Generic (PLEG): container finished" podID="067b738d-d34b-418d-8950-9d9a810eccae" containerID="597e1d41f1ee51a8b9e9628a6b3f4955764c967c4ca1429a632d4b25f96875c3" exitCode=0 Apr 16 22:27:34.107233 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.106913 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" event={"ID":"067b738d-d34b-418d-8950-9d9a810eccae","Type":"ContainerDied","Data":"597e1d41f1ee51a8b9e9628a6b3f4955764c967c4ca1429a632d4b25f96875c3"} Apr 16 22:27:34.108639 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.108607 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" event={"ID":"9a23b60f-2b38-46e1-acc5-6a715cc3966d","Type":"ContainerStarted","Data":"1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5"} Apr 16 22:27:34.108639 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.108635 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" event={"ID":"9a23b60f-2b38-46e1-acc5-6a715cc3966d","Type":"ContainerStarted","Data":"121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8"} Apr 16 22:27:34.108774 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.108755 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:34.126495 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.126449 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" 
podStartSLOduration=3.070853845 podStartE2EDuration="4.126435351s" podCreationTimestamp="2026-04-16 22:27:30 +0000 UTC" firstStartedPulling="2026-04-16 22:27:32.367835022 +0000 UTC m=+827.176088152" lastFinishedPulling="2026-04-16 22:27:33.423416525 +0000 UTC m=+828.231669658" observedRunningTime="2026-04-16 22:27:34.125223858 +0000 UTC m=+828.933477011" watchObservedRunningTime="2026-04-16 22:27:34.126435351 +0000 UTC m=+828.934688503" Apr 16 22:27:34.209703 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.209682 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" Apr 16 22:27:34.387195 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.387161 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/067b738d-d34b-418d-8950-9d9a810eccae-isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\") pod \"067b738d-d34b-418d-8950-9d9a810eccae\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " Apr 16 22:27:34.387384 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.387251 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/067b738d-d34b-418d-8950-9d9a810eccae-proxy-tls\") pod \"067b738d-d34b-418d-8950-9d9a810eccae\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " Apr 16 22:27:34.387384 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.387286 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d8gp\" (UniqueName: \"kubernetes.io/projected/067b738d-d34b-418d-8950-9d9a810eccae-kube-api-access-7d8gp\") pod \"067b738d-d34b-418d-8950-9d9a810eccae\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " Apr 16 22:27:34.387384 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.387308 2560 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/067b738d-d34b-418d-8950-9d9a810eccae-kserve-provision-location\") pod \"067b738d-d34b-418d-8950-9d9a810eccae\" (UID: \"067b738d-d34b-418d-8950-9d9a810eccae\") " Apr 16 22:27:34.387552 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.387517 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067b738d-d34b-418d-8950-9d9a810eccae-isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config") pod "067b738d-d34b-418d-8950-9d9a810eccae" (UID: "067b738d-d34b-418d-8950-9d9a810eccae"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:27:34.387680 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.387657 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/067b738d-d34b-418d-8950-9d9a810eccae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "067b738d-d34b-418d-8950-9d9a810eccae" (UID: "067b738d-d34b-418d-8950-9d9a810eccae"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:27:34.389420 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.389397 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067b738d-d34b-418d-8950-9d9a810eccae-kube-api-access-7d8gp" (OuterVolumeSpecName: "kube-api-access-7d8gp") pod "067b738d-d34b-418d-8950-9d9a810eccae" (UID: "067b738d-d34b-418d-8950-9d9a810eccae"). InnerVolumeSpecName "kube-api-access-7d8gp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:27:34.389420 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.389408 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067b738d-d34b-418d-8950-9d9a810eccae-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "067b738d-d34b-418d-8950-9d9a810eccae" (UID: "067b738d-d34b-418d-8950-9d9a810eccae"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:27:34.487843 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.487804 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7d8gp\" (UniqueName: \"kubernetes.io/projected/067b738d-d34b-418d-8950-9d9a810eccae-kube-api-access-7d8gp\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:27:34.487843 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.487836 2560 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/067b738d-d34b-418d-8950-9d9a810eccae-kserve-provision-location\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:27:34.487843 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.487846 2560 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/067b738d-d34b-418d-8950-9d9a810eccae-isvc-xgboost-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:27:34.488102 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:34.487858 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/067b738d-d34b-418d-8950-9d9a810eccae-proxy-tls\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:27:35.112841 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:35.112796 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" event={"ID":"067b738d-d34b-418d-8950-9d9a810eccae","Type":"ContainerDied","Data":"f571114be2bf641829c1f057f7359d120fba3077c44e8f22d0a31f615f09f837"} Apr 16 22:27:35.113334 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:35.112809 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn" Apr 16 22:27:35.113334 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:35.112852 2560 scope.go:117] "RemoveContainer" containerID="c7c426dc091216765a1a4dd918163532ecfccdb2259e0e44f42dabca55c8e00c" Apr 16 22:27:35.113334 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:35.113316 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:35.115108 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:35.115085 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:27:35.121112 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:35.121091 2560 scope.go:117] "RemoveContainer" containerID="597e1d41f1ee51a8b9e9628a6b3f4955764c967c4ca1429a632d4b25f96875c3" Apr 16 22:27:35.128196 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:35.128174 2560 scope.go:117] "RemoveContainer" containerID="f5db8ad71f626f7aa2637fb503d5d86daed314aa6dfd3589d5a46e6bf7a8b475" Apr 16 22:27:35.150930 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:35.150906 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"] Apr 16 22:27:35.154097 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:35.154079 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-23b0a-predictor-79b467f6df-wwfnn"] Apr 16 
22:27:35.777743 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:35.777708 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="067b738d-d34b-418d-8950-9d9a810eccae" path="/var/lib/kubelet/pods/067b738d-d34b-418d-8950-9d9a810eccae/volumes" Apr 16 22:27:42.123377 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:27:42.123346 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:28:45.693403 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:28:45.693326 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log" Apr 16 22:28:45.693403 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:28:45.693326 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log" Apr 16 22:29:15.502175 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:15.502148 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv_9a23b60f-2b38-46e1-acc5-6a715cc3966d/kserve-container/0.log" Apr 16 22:29:15.792508 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:15.792431 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv"] Apr 16 22:29:15.792727 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:15.792695 2560 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" podUID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" containerName="kserve-container" containerID="cri-o://121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8" gracePeriod=30 Apr 16 22:29:15.792922 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:15.792732 2560 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" podUID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" containerName="kube-rbac-proxy" containerID="cri-o://1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5" gracePeriod=30 Apr 16 22:29:16.028904 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.028866 2560 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:29:16.050201 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.050136 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls\") pod \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " Apr 16 22:29:16.050201 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.050180 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a23b60f-2b38-46e1-acc5-6a715cc3966d-message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\") pod \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " Apr 16 22:29:16.050379 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.050228 2560 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc75x\" (UniqueName: \"kubernetes.io/projected/9a23b60f-2b38-46e1-acc5-6a715cc3966d-kube-api-access-wc75x\") pod \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\" (UID: \"9a23b60f-2b38-46e1-acc5-6a715cc3966d\") " Apr 16 22:29:16.050542 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.050513 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9a23b60f-2b38-46e1-acc5-6a715cc3966d-message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config") pod "9a23b60f-2b38-46e1-acc5-6a715cc3966d" (UID: "9a23b60f-2b38-46e1-acc5-6a715cc3966d"). InnerVolumeSpecName "message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:29:16.052292 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.052263 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9a23b60f-2b38-46e1-acc5-6a715cc3966d" (UID: "9a23b60f-2b38-46e1-acc5-6a715cc3966d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:29:16.052372 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.052320 2560 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a23b60f-2b38-46e1-acc5-6a715cc3966d-kube-api-access-wc75x" (OuterVolumeSpecName: "kube-api-access-wc75x") pod "9a23b60f-2b38-46e1-acc5-6a715cc3966d" (UID: "9a23b60f-2b38-46e1-acc5-6a715cc3966d"). InnerVolumeSpecName "kube-api-access-wc75x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:29:16.151137 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.151090 2560 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wc75x\" (UniqueName: \"kubernetes.io/projected/9a23b60f-2b38-46e1-acc5-6a715cc3966d-kube-api-access-wc75x\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:29:16.151137 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.151130 2560 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a23b60f-2b38-46e1-acc5-6a715cc3966d-proxy-tls\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:29:16.151137 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.151145 2560 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a23b60f-2b38-46e1-acc5-6a715cc3966d-message-dumper-raw-1bd3d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-35.ec2.internal\" DevicePath \"\"" Apr 16 22:29:16.386069 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.386030 2560 generic.go:358] "Generic (PLEG): container finished" podID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" containerID="1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5" exitCode=2 Apr 16 22:29:16.386069 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.386062 2560 generic.go:358] "Generic (PLEG): container finished" podID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" containerID="121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8" exitCode=2 Apr 16 22:29:16.386279 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.386110 2560 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" Apr 16 22:29:16.386279 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.386108 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" event={"ID":"9a23b60f-2b38-46e1-acc5-6a715cc3966d","Type":"ContainerDied","Data":"1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5"} Apr 16 22:29:16.386279 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.386149 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" event={"ID":"9a23b60f-2b38-46e1-acc5-6a715cc3966d","Type":"ContainerDied","Data":"121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8"} Apr 16 22:29:16.386279 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.386162 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv" event={"ID":"9a23b60f-2b38-46e1-acc5-6a715cc3966d","Type":"ContainerDied","Data":"b88b02fec0fa730291c178a424ee7908b261fcd62cbee172dad9282f73856223"} Apr 16 22:29:16.386279 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.386177 2560 scope.go:117] "RemoveContainer" containerID="1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5" Apr 16 22:29:16.393800 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.393777 2560 scope.go:117] "RemoveContainer" containerID="121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8" Apr 16 22:29:16.400712 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.400692 2560 scope.go:117] "RemoveContainer" containerID="1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5" Apr 16 22:29:16.401059 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:29:16.401039 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5\": container with ID starting with 1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5 not found: ID does not exist" containerID="1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5" Apr 16 22:29:16.401109 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.401067 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5"} err="failed to get container status \"1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5\": rpc error: code = NotFound desc = could not find container \"1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5\": container with ID starting with 1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5 not found: ID does not exist" Apr 16 22:29:16.401109 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.401085 2560 scope.go:117] "RemoveContainer" containerID="121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8" Apr 16 22:29:16.401332 ip-10-0-142-35 kubenswrapper[2560]: E0416 22:29:16.401314 2560 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8\": container with ID starting with 121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8 not found: ID does not exist" containerID="121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8" Apr 16 22:29:16.401380 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.401338 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8"} err="failed to get container status \"121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8\": rpc error: code = NotFound desc = could not find container 
\"121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8\": container with ID starting with 121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8 not found: ID does not exist" Apr 16 22:29:16.401380 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.401352 2560 scope.go:117] "RemoveContainer" containerID="1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5" Apr 16 22:29:16.401546 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.401528 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5"} err="failed to get container status \"1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5\": rpc error: code = NotFound desc = could not find container \"1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5\": container with ID starting with 1b0776ed9dc560f53bbbc085b69368ab0c837dc0de6818a27dd2e178ee2fb5e5 not found: ID does not exist" Apr 16 22:29:16.401599 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.401546 2560 scope.go:117] "RemoveContainer" containerID="121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8" Apr 16 22:29:16.401734 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.401718 2560 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8"} err="failed to get container status \"121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8\": rpc error: code = NotFound desc = could not find container \"121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8\": container with ID starting with 121d017ca1ab6c42e9f67e0a088d1b1d866deaa84f1d994c1df39883d18086d8 not found: ID does not exist" Apr 16 22:29:16.406491 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.406470 2560 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv"] Apr 16 22:29:16.411032 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:16.411014 2560 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-1bd3d-predictor-5466bd4584-gjbrv"] Apr 16 22:29:17.776372 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:29:17.776336 2560 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" path="/var/lib/kubelet/pods/9a23b60f-2b38-46e1-acc5-6a715cc3966d/volumes" Apr 16 22:33:45.710985 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:33:45.710955 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log" Apr 16 22:33:45.712305 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:33:45.712284 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log" Apr 16 22:36:35.427340 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427299 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wls74/must-gather-d2kn2"] Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427576 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kube-rbac-proxy" Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427589 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kube-rbac-proxy" Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427602 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container" Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 
22:36:35.427607 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container"
Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427619 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" containerName="kserve-container"
Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427626 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" containerName="kserve-container"
Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427635 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="storage-initializer"
Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427640 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="storage-initializer"
Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427646 2560 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" containerName="kube-rbac-proxy"
Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427651 2560 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" containerName="kube-rbac-proxy"
Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427692 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" containerName="kube-rbac-proxy"
Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427700 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kserve-container"
Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427708 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="067b738d-d34b-418d-8950-9d9a810eccae" containerName="kube-rbac-proxy"
Apr 16 22:36:35.427766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.427713 2560 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a23b60f-2b38-46e1-acc5-6a715cc3966d" containerName="kserve-container"
Apr 16 22:36:35.430580 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.430562 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wls74/must-gather-d2kn2"
Apr 16 22:36:35.433153 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.433124 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wls74\"/\"kube-root-ca.crt\""
Apr 16 22:36:35.433286 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.433124 2560 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wls74\"/\"openshift-service-ca.crt\""
Apr 16 22:36:35.434111 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.434097 2560 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wls74\"/\"default-dockercfg-zkhg7\""
Apr 16 22:36:35.436764 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.436740 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wls74/must-gather-d2kn2"]
Apr 16 22:36:35.595928 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.595889 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/827e70a9-9b81-4017-8441-358aac34df13-must-gather-output\") pod \"must-gather-d2kn2\" (UID: \"827e70a9-9b81-4017-8441-358aac34df13\") " pod="openshift-must-gather-wls74/must-gather-d2kn2"
Apr 16 22:36:35.596099 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.595940 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hclqr\" (UniqueName: \"kubernetes.io/projected/827e70a9-9b81-4017-8441-358aac34df13-kube-api-access-hclqr\") pod \"must-gather-d2kn2\" (UID: \"827e70a9-9b81-4017-8441-358aac34df13\") " pod="openshift-must-gather-wls74/must-gather-d2kn2"
Apr 16 22:36:35.696517 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.696447 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/827e70a9-9b81-4017-8441-358aac34df13-must-gather-output\") pod \"must-gather-d2kn2\" (UID: \"827e70a9-9b81-4017-8441-358aac34df13\") " pod="openshift-must-gather-wls74/must-gather-d2kn2"
Apr 16 22:36:35.696517 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.696488 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hclqr\" (UniqueName: \"kubernetes.io/projected/827e70a9-9b81-4017-8441-358aac34df13-kube-api-access-hclqr\") pod \"must-gather-d2kn2\" (UID: \"827e70a9-9b81-4017-8441-358aac34df13\") " pod="openshift-must-gather-wls74/must-gather-d2kn2"
Apr 16 22:36:35.696791 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.696770 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/827e70a9-9b81-4017-8441-358aac34df13-must-gather-output\") pod \"must-gather-d2kn2\" (UID: \"827e70a9-9b81-4017-8441-358aac34df13\") " pod="openshift-must-gather-wls74/must-gather-d2kn2"
Apr 16 22:36:35.704687 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.704658 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hclqr\" (UniqueName: \"kubernetes.io/projected/827e70a9-9b81-4017-8441-358aac34df13-kube-api-access-hclqr\") pod \"must-gather-d2kn2\" (UID: \"827e70a9-9b81-4017-8441-358aac34df13\") " pod="openshift-must-gather-wls74/must-gather-d2kn2"
Apr 16 22:36:35.740285 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.740251 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wls74/must-gather-d2kn2"
Apr 16 22:36:35.860631 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.860600 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wls74/must-gather-d2kn2"]
Apr 16 22:36:35.863620 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:36:35.863591 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod827e70a9_9b81_4017_8441_358aac34df13.slice/crio-7d39358570cc5e9e4a7476a69b2df8cc5ac90d83558256002b3a8e5f3071d4da WatchSource:0}: Error finding container 7d39358570cc5e9e4a7476a69b2df8cc5ac90d83558256002b3a8e5f3071d4da: Status 404 returned error can't find the container with id 7d39358570cc5e9e4a7476a69b2df8cc5ac90d83558256002b3a8e5f3071d4da
Apr 16 22:36:35.865475 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:35.865452 2560 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:36:36.558493 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:36.558452 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wls74/must-gather-d2kn2" event={"ID":"827e70a9-9b81-4017-8441-358aac34df13","Type":"ContainerStarted","Data":"7d39358570cc5e9e4a7476a69b2df8cc5ac90d83558256002b3a8e5f3071d4da"}
Apr 16 22:36:37.566923 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:37.566864 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wls74/must-gather-d2kn2" event={"ID":"827e70a9-9b81-4017-8441-358aac34df13","Type":"ContainerStarted","Data":"5113301956c3f954448458ec08e17ab8467081c98dcef7287b490fc5c97e9d84"}
Apr 16 22:36:37.567393 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:37.566930 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wls74/must-gather-d2kn2" event={"ID":"827e70a9-9b81-4017-8441-358aac34df13","Type":"ContainerStarted","Data":"ceca0774bbf43ab0c7143492ae0f66963016a904876953f3cadbbb12f2e8b196"}
Apr 16 22:36:37.583228 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:37.583178 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wls74/must-gather-d2kn2" podStartSLOduration=1.7363752529999998 podStartE2EDuration="2.58316279s" podCreationTimestamp="2026-04-16 22:36:35 +0000 UTC" firstStartedPulling="2026-04-16 22:36:35.865578599 +0000 UTC m=+1370.673831729" lastFinishedPulling="2026-04-16 22:36:36.712366123 +0000 UTC m=+1371.520619266" observedRunningTime="2026-04-16 22:36:37.581455109 +0000 UTC m=+1372.389708260" watchObservedRunningTime="2026-04-16 22:36:37.58316279 +0000 UTC m=+1372.391415942"
Apr 16 22:36:38.075377 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:38.075344 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9lc9p_816c1425-a29e-4e5e-b5ae-ad9b214b5349/global-pull-secret-syncer/0.log"
Apr 16 22:36:38.169888 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:38.169839 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4frh7_fc7d6449-50b3-4246-90df-a37a0edc66d9/konnectivity-agent/0.log"
Apr 16 22:36:38.291126 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:38.291094 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-35.ec2.internal_e12fe8cf6bc72ee22e06a7c9a61455c5/haproxy/0.log"
Apr 16 22:36:42.074599 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:42.074572 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fxpxv_21a3f8ac-083c-407f-b480-6676a3c5c069/node-exporter/0.log"
Apr 16 22:36:42.099379 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:42.099351 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fxpxv_21a3f8ac-083c-407f-b480-6676a3c5c069/kube-rbac-proxy/0.log"
Apr 16 22:36:42.123001 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:42.122977 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fxpxv_21a3f8ac-083c-407f-b480-6676a3c5c069/init-textfile/0.log"
Apr 16 22:36:42.601048 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:42.601025 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5dfcfc8856-ksrx4_f22f2730-da51-48ae-9486-69fd5fa3e87e/telemeter-client/0.log"
Apr 16 22:36:42.622126 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:42.622101 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5dfcfc8856-ksrx4_f22f2730-da51-48ae-9486-69fd5fa3e87e/reload/0.log"
Apr 16 22:36:42.645503 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:42.645474 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5dfcfc8856-ksrx4_f22f2730-da51-48ae-9486-69fd5fa3e87e/kube-rbac-proxy/0.log"
Apr 16 22:36:44.710912 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:44.710860 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dbdc49d55-ttn49_c04f52de-8bf1-49c0-b49d-5142d84557e0/console/0.log"
Apr 16 22:36:45.296572 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.296538 2560 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"]
Apr 16 22:36:45.299780 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.299754 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.307100 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.307075 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"]
Apr 16 22:36:45.480188 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.480147 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-proc\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.480461 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.480435 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-podres\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.480567 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.480549 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-sys\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.480704 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.480689 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-lib-modules\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.480854 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.480841 2560 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b946h\" (UniqueName: \"kubernetes.io/projected/03296a43-2627-4486-9571-a3f6933cd543-kube-api-access-b946h\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.581721 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.581685 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b946h\" (UniqueName: \"kubernetes.io/projected/03296a43-2627-4486-9571-a3f6933cd543-kube-api-access-b946h\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.581946 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.581845 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-proc\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.581946 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.581905 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-podres\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.582056 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.581949 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-proc\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.582056 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.581954 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-sys\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.582056 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.582009 2560 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-lib-modules\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.582056 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.582012 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-sys\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.582201 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.582092 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-podres\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.582201 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.582121 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03296a43-2627-4486-9571-a3f6933cd543-lib-modules\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.589673 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.589650 2560 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b946h\" (UniqueName: \"kubernetes.io/projected/03296a43-2627-4486-9571-a3f6933cd543-kube-api-access-b946h\") pod \"perf-node-gather-daemonset-dqx4n\" (UID: \"03296a43-2627-4486-9571-a3f6933cd543\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.616095 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.616057 2560 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:45.762922 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.762728 2560 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"]
Apr 16 22:36:45.765510 ip-10-0-142-35 kubenswrapper[2560]: W0416 22:36:45.765466 2560 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod03296a43_2627_4486_9571_a3f6933cd543.slice/crio-fabe29a191ac6d070012baa748534f6af95b31898de3bcc77d703dcfd28867a4 WatchSource:0}: Error finding container fabe29a191ac6d070012baa748534f6af95b31898de3bcc77d703dcfd28867a4: Status 404 returned error can't find the container with id fabe29a191ac6d070012baa748534f6af95b31898de3bcc77d703dcfd28867a4
Apr 16 22:36:45.767317 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.767300 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rwmdq_d49b12b0-4e0c-4cb5-baba-dda53600ba56/dns/0.log"
Apr 16 22:36:45.792254 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.792234 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rwmdq_d49b12b0-4e0c-4cb5-baba-dda53600ba56/kube-rbac-proxy/0.log"
Apr 16 22:36:45.948926 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:45.948900 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p58zd_81cb0b16-9454-40e0-907f-db4de6741a0c/dns-node-resolver/0.log"
Apr 16 22:36:46.324204 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:46.324176 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5d6966496b-phxbk_c4a5d274-9cc4-4ba2-b606-f554ecfd3f5d/registry/0.log"
Apr 16 22:36:46.363013 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:46.362986 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bn8st_11442be0-1bee-4a8d-8374-44ea164b6268/node-ca/0.log"
Apr 16 22:36:46.600077 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:46.599998 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n" event={"ID":"03296a43-2627-4486-9571-a3f6933cd543","Type":"ContainerStarted","Data":"f511abfc4b95833b4a231adce58242fe4efe712eef15d1afe99448359b021047"}
Apr 16 22:36:46.600077 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:46.600038 2560 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n" event={"ID":"03296a43-2627-4486-9571-a3f6933cd543","Type":"ContainerStarted","Data":"fabe29a191ac6d070012baa748534f6af95b31898de3bcc77d703dcfd28867a4"}
Apr 16 22:36:46.600255 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:46.600079 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:46.623702 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:46.623642 2560 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n" podStartSLOduration=1.6236231079999999 podStartE2EDuration="1.623623108s" podCreationTimestamp="2026-04-16 22:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:36:46.622126411 +0000 UTC m=+1381.430379567" watchObservedRunningTime="2026-04-16 22:36:46.623623108 +0000 UTC m=+1381.431876261"
Apr 16 22:36:47.382523 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:47.382491 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dnq97_47261062-7dc9-439a-ab78-089432ccd885/serve-healthcheck-canary/0.log"
Apr 16 22:36:47.918616 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:47.918585 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rspvv_c2c07ee2-f936-48dd-bba9-993d5f5a4d00/kube-rbac-proxy/0.log"
Apr 16 22:36:47.940181 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:47.940154 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rspvv_c2c07ee2-f936-48dd-bba9-993d5f5a4d00/exporter/0.log"
Apr 16 22:36:47.959992 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:47.959963 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rspvv_c2c07ee2-f936-48dd-bba9-993d5f5a4d00/extractor/0.log"
Apr 16 22:36:49.922460 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:49.922433 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-42hbz_3fa23ecb-011f-4af6-ae90-66bd39889f28/manager/0.log"
Apr 16 22:36:52.616047 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:52.615198 2560 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-dqx4n"
Apr 16 22:36:55.016022 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:55.015948 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gzw4_2b0c884f-a52c-4cb2-8c9e-b1036ea24b12/kube-multus/0.log"
Apr 16 22:36:55.201567 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:55.201540 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9sv6h_e68012e7-cccf-4da0-86e1-1355b80e2784/kube-multus-additional-cni-plugins/0.log"
Apr 16 22:36:55.222214 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:55.222190 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9sv6h_e68012e7-cccf-4da0-86e1-1355b80e2784/egress-router-binary-copy/0.log"
Apr 16 22:36:55.242263 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:55.242226 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9sv6h_e68012e7-cccf-4da0-86e1-1355b80e2784/cni-plugins/0.log"
Apr 16 22:36:55.265782 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:55.265756 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9sv6h_e68012e7-cccf-4da0-86e1-1355b80e2784/bond-cni-plugin/0.log"
Apr 16 22:36:55.286838 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:55.286775 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9sv6h_e68012e7-cccf-4da0-86e1-1355b80e2784/routeoverride-cni/0.log"
Apr 16 22:36:55.308283 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:55.308252 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9sv6h_e68012e7-cccf-4da0-86e1-1355b80e2784/whereabouts-cni-bincopy/0.log"
Apr 16 22:36:55.332601 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:55.332555 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9sv6h_e68012e7-cccf-4da0-86e1-1355b80e2784/whereabouts-cni/0.log"
Apr 16 22:36:55.648954 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:55.648925 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cph62_f45f69f8-87a8-49f9-bb8b-485368427802/network-metrics-daemon/0.log"
Apr 16 22:36:55.671837 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:55.671808 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cph62_f45f69f8-87a8-49f9-bb8b-485368427802/kube-rbac-proxy/0.log"
Apr 16 22:36:56.392321 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:56.392288 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-controller/0.log"
Apr 16 22:36:56.412157 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:56.412118 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/0.log"
Apr 16 22:36:56.419625 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:56.419604 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovn-acl-logging/1.log"
Apr 16 22:36:56.438172 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:56.438151 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/kube-rbac-proxy-node/0.log"
Apr 16 22:36:56.457948 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:56.457912 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 22:36:56.474478 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:56.474449 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/northd/0.log"
Apr 16 22:36:56.495839 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:56.495814 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/nbdb/0.log"
Apr 16 22:36:56.515718 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:56.515696 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/sbdb/0.log"
Apr 16 22:36:56.636778 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:56.636700 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5ggql_703a2cee-8a1d-4b57-b34b-e3d59b2bc18a/ovnkube-controller/0.log"
Apr 16 22:36:58.195766 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:58.195732 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qfmn8_fac69899-e7fe-4a77-b0b3-504c9f451bdf/network-check-target-container/0.log"
Apr 16 22:36:59.024545 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:59.024513 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-249mr_29af9e08-7c9a-47dc-a8e2-f5af3aee18db/iptables-alerter/0.log"
Apr 16 22:36:59.662904 ip-10-0-142-35 kubenswrapper[2560]: I0416 22:36:59.662819 2560 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7zwqh_40fbccdd-dab4-458a-93d4-a60daf555bc6/tuned/0.log"