Apr 16 19:51:19.199081 ip-10-0-136-138 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 19:51:19.199099 ip-10-0-136-138 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 19:51:19.199109 ip-10-0-136-138 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 19:51:19.199424 ip-10-0-136-138 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 19:51:29.438839 ip-10-0-136-138 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 19:51:29.438859 ip-10-0-136-138 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 754c0ef9f30448ebabac26364d5d8219 --
Apr 16 19:54:06.596121 ip-10-0-136-138 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:54:07.076999 ip-10-0-136-138 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:54:07.076999 ip-10-0-136-138 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:54:07.076999 ip-10-0-136-138 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:54:07.076999 ip-10-0-136-138 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:54:07.076999 ip-10-0-136-138 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:54:07.079936 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.079854 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:54:07.083022 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083008 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083023 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083027 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083030 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083033 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083036 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083038 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083041 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083044 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083046 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083049 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083051 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083054 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:54:07.083058 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083064 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083067 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083070 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083072 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083082 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083087 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083090 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083093 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083095 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083098 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083100 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083103 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083105 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083108 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083110 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083113 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083115 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083119 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083123 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:54:07.083380 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083127 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083131 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083134 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083137 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083140 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083142 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083145 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083147 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083150 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083152 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083155 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083157 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083160 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083162 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083166 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083169 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083172 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083175 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083177 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083181 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:54:07.083875 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083184 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083186 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083189 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083191 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083194 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083197 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083199 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083201 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083204 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083206 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083208 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083211 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083213 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083216 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083219 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083221 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083225 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083227 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083229 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083232 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:54:07.084389 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083234 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083237 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083239 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083241 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083244 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083247 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083250 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083253 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083256 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083258 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083260 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083263 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083266 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083268 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083637 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083642 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083645 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083647 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083650 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:54:07.084879 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083653 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083655 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083658 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083661 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083663 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083666 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083669 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083671 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083674 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083677 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083680 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083682 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083685 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083687 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083689 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083694 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083697 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083700 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083703 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:54:07.085327 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083706 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083710 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083713 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083715 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083718 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083720 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083723 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083725 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083727 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083730 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083732 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083734 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083737 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083740 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083742 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083745 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083748 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083750 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083753 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083756 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:54:07.085777 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083758 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083761 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083763 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083766 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083769 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083771 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083773 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083776 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083778 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083780 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083783 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083785 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083789 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083792 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083795 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083797 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083800 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083802 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083805 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083807 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:54:07.086285 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083809 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083812 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083814 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083817 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083819 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083823 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083826 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083828 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083846 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083849 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083851 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083854 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083857 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083860 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083863 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083865 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083868 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083870 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083872 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083875 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:54:07.086772 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083878 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.083880 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.084955 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.084964 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.084970 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.084975 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.084979 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.084984 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.084988 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.084993 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.084996 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.084999 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085002 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085005 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085008 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085011 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085014 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085017 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085020 2567 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085022 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085026 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085031 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085034 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085037 2567 flags.go:64] FLAG: --config-dir=""
Apr 16 19:54:07.087299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085040 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085043 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085047 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085050 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085053 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085056 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085059 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085062 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085065 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085069 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085071 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085080 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085083 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085086 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085088 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085091 2567 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085095 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr
16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085099 2567 flags.go:64] FLAG: --event-burst="100" Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085102 2567 flags.go:64] FLAG: --event-qps="50" Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085105 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085108 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085111 2567 flags.go:64] FLAG: --eviction-hard="" Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085115 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085118 2567 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085121 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 19:54:07.087895 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085124 2567 flags.go:64] FLAG: --eviction-soft="" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085127 2567 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085130 2567 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085132 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085135 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085138 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085141 
2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085144 2567 flags.go:64] FLAG: --feature-gates="" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085148 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085151 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085153 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085156 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085159 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085162 2567 flags.go:64] FLAG: --help="false" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085165 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-136-138.ec2.internal" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085168 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085171 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085173 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085177 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085180 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:54:07.088498 ip-10-0-136-138 
kubenswrapper[2567]: I0416 19:54:07.085182 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085185 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085188 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:54:07.088498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085191 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085194 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085197 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085200 2567 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085203 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085206 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085209 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085214 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085216 2567 flags.go:64] FLAG: --lock-file="" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085219 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085222 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085225 2567 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085230 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085232 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085235 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085238 2567 flags.go:64] FLAG: --logging-format="text" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085241 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085244 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085247 2567 flags.go:64] FLAG: --manifest-url="" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085250 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085254 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085257 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085261 2567 flags.go:64] FLAG: --max-pods="110" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085263 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085266 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:54:07.089093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085269 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:54:07.089683 ip-10-0-136-138 
kubenswrapper[2567]: I0416 19:54:07.085272 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085275 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085278 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085280 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085288 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085291 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085294 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085297 2567 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085300 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085306 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085309 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085312 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085314 2567 flags.go:64] FLAG: --port="10250" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085317 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 
19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085321 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03c473fe239d196ad" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085324 2567 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085327 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085330 2567 flags.go:64] FLAG: --register-node="true" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085333 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085335 2567 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085339 2567 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085342 2567 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085344 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085347 2567 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:54:07.089683 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085351 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085354 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085357 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085359 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085362 2567 flags.go:64] FLAG: --runonce="false" Apr 16 
19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085365 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085368 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085371 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085374 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085377 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085379 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085382 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085385 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085388 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085391 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085393 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085396 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085400 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085403 2567 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: 
I0416 19:54:07.085406 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085411 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085414 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085420 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085430 2567 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085433 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:54:07.090317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085436 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085439 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085442 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085444 2567 flags.go:64] FLAG: --v="2" Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085453 2567 flags.go:64] FLAG: --version="false" Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085457 2567 flags.go:64] FLAG: --vmodule="" Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085461 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.085464 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086433 2567 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086438 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086441 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086444 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086447 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086450 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086453 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086455 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086458 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086460 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086463 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086466 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086468 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086470 2567 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:54:07.090934 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086473 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086475 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086478 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086482 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086485 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086489 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086491 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086495 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086498 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086500 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086502 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086505 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 
19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086507 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086510 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086513 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086515 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086518 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086521 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086523 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086525 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:54:07.091464 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086528 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086530 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086532 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086535 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086537 2567 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086541 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086544 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086547 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086550 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086553 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086555 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086558 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086560 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086563 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086567 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086569 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086572 2567 feature_gate.go:328] unrecognized feature 
gate: Example Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086575 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086577 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:54:07.091991 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086579 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086582 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086584 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086587 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086589 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086592 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086594 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086596 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086598 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086601 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:54:07.092460 
ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086603 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086606 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086608 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086610 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086613 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086615 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086618 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086620 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086622 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086625 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:54:07.092460 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086627 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086630 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086632 2567 feature_gate.go:328] unrecognized feature gate: 
ExternalSnapshotMetadata Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086634 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086637 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086640 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086642 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086646 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086649 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086651 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086654 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086656 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.086658 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:54:07.092960 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.087185 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:54:07.094881 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.094862 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 19:54:07.094919 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.094883 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 19:54:07.094949 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094943 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:54:07.094949 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094949 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094952 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094955 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094958 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094962 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094965 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094968 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094972 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094977 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094980 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094983 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094986 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094989 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094992 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094995 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.094998 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095001 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095004 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095007 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:54:07.095008 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095010 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:54:07.095620 ip-10-0-136-138 
kubenswrapper[2567]: W0416 19:54:07.095013 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095024 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095027 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095030 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095032 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095035 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095037 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095040 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095042 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095045 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095047 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095050 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095052 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 
19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095055 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095057 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095060 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095062 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095066 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095069 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:54:07.095620 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095072 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095085 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095088 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095091 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095093 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095096 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:54:07.096425 
ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095098 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095101 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095103 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095106 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095108 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095110 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095113 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095115 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095118 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095127 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095130 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095132 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095135 2567 feature_gate.go:328] unrecognized 
feature gate: VolumeGroupSnapshot Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095137 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:54:07.096425 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095140 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095142 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095145 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095147 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095150 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095152 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095155 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095157 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095160 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095162 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095165 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: 
W0416 19:54:07.095167 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095169 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095172 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095174 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095177 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095179 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095181 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095184 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095186 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:54:07.097033 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095189 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095191 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095193 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095196 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:54:07.097610 ip-10-0-136-138 
kubenswrapper[2567]: W0416 19:54:07.095198 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095200 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.095206 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095739 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095755 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095759 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095764 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095769 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095774 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095779 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:54:07.097610 
ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095789 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095793 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:54:07.097610 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095797 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095802 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095806 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095811 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095815 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095819 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095826 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095850 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095855 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095860 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095865 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095875 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095880 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095884 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095889 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095894 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095898 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095902 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095907 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095912 2567 feature_gate.go:328] unrecognized feature 
gate: AzureWorkloadIdentity Apr 16 19:54:07.098079 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095917 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095921 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095926 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095931 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095941 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095945 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095950 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095954 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095958 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095963 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095967 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095972 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095977 
2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095981 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095985 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.095989 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096000 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096006 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096010 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:54:07.098557 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096014 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096018 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096023 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096027 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096031 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096035 2567 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096039 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096044 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096048 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096058 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096061 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096066 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096070 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096075 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096079 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096084 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096087 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096092 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 
19:54:07.096096 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096100 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:54:07.099037 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096104 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096108 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096117 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096121 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096125 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096129 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096133 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096137 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096141 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096145 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096149 2567 feature_gate.go:328] 
unrecognized feature gate: DNSNameResolver
Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096154 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096158 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096162 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096166 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096175 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096179 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:54:07.099565 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:07.096184 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:54:07.099988 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.096192 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:54:07.099988 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.097890 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:54:07.102110 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.102095 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:54:07.103131 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.103120 2567 server.go:1019] "Starting client certificate rotation"
Apr 16 19:54:07.103226 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.103211 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:54:07.104172 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.104161 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:54:07.133257 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.133238 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:54:07.138646 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.138613 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:54:07.150092 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.150077 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:54:07.157446 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.157427 2567 log.go:25] "Validated CRI v1 image API"
Apr 16 19:54:07.159368 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.159349 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:54:07.162496 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.162468 2567 fs.go:135] Filesystem UUIDs: map[668e04f2-6322-4309-8967-206443e398fd:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fcde7735-b66a-4e4d-a722-b57a05f54788:/dev/nvme0n1p3]
Apr 16 19:54:07.162569 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.162492 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:54:07.167712 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.167607 2567 manager.go:217] Machine: {Timestamp:2026-04-16 19:54:07.166246055 +0000 UTC m=+0.441922620 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3092879 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28b922871698f76d77413c2411dcc0 SystemUUID:ec28b922-8716-98f7-6d77-413c2411dcc0 BootID:754c0ef9-f304-48eb-abac-26364d5d8219 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:75:ea:de:76:cf Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:75:ea:de:76:cf Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:4a:3f:5b:1e:96 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:54:07.167712 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.167705 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:54:07.167818 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.167809 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:54:07.170750 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.170731 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:54:07.171195 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.171169 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:54:07.171337 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.171198 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-138.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Perce
ntage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 19:54:07.171386 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.171346 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 19:54:07.171386 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.171355 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 19:54:07.171386 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.171368 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:54:07.171464 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.171390 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:54:07.173023 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.173012 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:54:07.173127 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.173118 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 19:54:07.175861 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.175851 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 16 19:54:07.175903 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.175873 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 19:54:07.175903 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.175892 2567 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 16 19:54:07.175985 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.175907 2567 kubelet.go:397] "Adding apiserver pod source" Apr 16 19:54:07.175985 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.175924 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 19:54:07.179744 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.179723 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:54:07.179825 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.179765 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:54:07.183187 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.183159 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:54:07.184879 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.184866 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:54:07.185120 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.185100 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:54:07.185186 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.185159 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-138.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:54:07.186672 ip-10-0-136-138 kubenswrapper[2567]: 
I0416 19:54:07.186659 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:54:07.186716 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186677 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:54:07.186716 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186684 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:54:07.186716 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186690 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:54:07.186716 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186695 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:54:07.186716 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186700 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:54:07.186716 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186706 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 19:54:07.186716 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186711 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:54:07.186716 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186719 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:54:07.186968 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186725 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:54:07.186968 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186733 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:54:07.186968 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.186742 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:54:07.188077 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.188067 2567 plugins.go:616] "Loaded 
volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:54:07.188077 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.188077 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:54:07.191274 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.191259 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-138.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:54:07.191413 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.191401 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:54:07.191460 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.191437 2567 server.go:1295] "Started kubelet" Apr 16 19:54:07.191532 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.191508 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:54:07.191586 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.191537 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:54:07.191637 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.191591 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:54:07.192182 ip-10-0-136-138 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 19:54:07.192677 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.192590 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:54:07.194016 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.194002 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:54:07.198024 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.198007 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x9p65" Apr 16 19:54:07.198446 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.198430 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:54:07.199037 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.199021 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:54:07.200062 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.199884 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:54:07.200062 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.199901 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:54:07.200062 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.199911 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found" Apr 16 19:54:07.200062 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.199952 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:54:07.200062 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.200053 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:54:07.200062 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.200061 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:54:07.200366 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.200281 2567 kubelet.go:1618] "Image garbage 
collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:54:07.202270 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.202245 2567 factory.go:153] Registering CRI-O factory Apr 16 19:54:07.202270 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.202271 2567 factory.go:223] Registration of the crio container factory successfully Apr 16 19:54:07.202417 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.202325 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:54:07.202417 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.202332 2567 factory.go:55] Registering systemd factory Apr 16 19:54:07.202417 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.202340 2567 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:54:07.202417 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.202372 2567 factory.go:103] Registering Raw factory Apr 16 19:54:07.202417 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.202382 2567 manager.go:1196] Started watching for new ooms in manager Apr 16 19:54:07.204608 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.204524 2567 manager.go:319] Starting recovery of all containers Apr 16 19:54:07.206309 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.206219 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 19:54:07.206410 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.206359 2567 csr.go:270] "Certificate signing 
request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x9p65" Apr 16 19:54:07.208303 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.206970 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-138.ec2.internal.18a6ee6b30c5b746 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-138.ec2.internal,UID:ip-10-0-136-138.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-138.ec2.internal,},FirstTimestamp:2026-04-16 19:54:07.191414598 +0000 UTC m=+0.467091164,LastTimestamp:2026-04-16 19:54:07.191414598 +0000 UTC m=+0.467091164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-138.ec2.internal,}" Apr 16 19:54:07.208419 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.208401 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-138.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 19:54:07.216075 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.216061 2567 manager.go:324] Recovery completed Apr 16 19:54:07.217693 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.217669 2567 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 19:54:07.220509 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.220497 2567 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:54:07.223300 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.223280 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:54:07.223378 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.223311 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:54:07.223378 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.223325 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:54:07.223898 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.223883 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:54:07.223959 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.223900 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:54:07.223959 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.223918 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:54:07.226434 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.226423 2567 policy_none.go:49] "None policy: Start" Apr 16 19:54:07.226472 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.226438 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:54:07.226472 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.226448 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:54:07.271367 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.266383 2567 manager.go:341] "Starting Device Plugin manager" Apr 16 19:54:07.271367 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.266415 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:54:07.271367 ip-10-0-136-138 
kubenswrapper[2567]: I0416 19:54:07.266425 2567 server.go:85] "Starting device plugin registration server" Apr 16 19:54:07.271367 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.266686 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:54:07.271367 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.266721 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:54:07.271367 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.266872 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:54:07.271367 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.266943 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:54:07.271367 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.266953 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:54:07.271367 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.267441 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 19:54:07.271367 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.267609 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-138.ec2.internal\" not found" Apr 16 19:54:07.320649 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.320623 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:54:07.321951 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.321934 2567 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 19:54:07.322048 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.321958 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:54:07.322048 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.321980 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 19:54:07.322048 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.321990 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:54:07.322202 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.322064 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:54:07.325493 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.325476 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:54:07.367687 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.367627 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:54:07.368421 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.368403 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:54:07.368494 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.368436 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:54:07.368494 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.368449 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:54:07.368494 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.368471 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-138.ec2.internal" Apr 16 19:54:07.377352 ip-10-0-136-138 kubenswrapper[2567]: I0416 
19:54:07.377332 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-138.ec2.internal" Apr 16 19:54:07.377444 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.377355 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-138.ec2.internal\": node \"ip-10-0-136-138.ec2.internal\" not found" Apr 16 19:54:07.393340 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.393310 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found" Apr 16 19:54:07.423034 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.423009 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal"] Apr 16 19:54:07.423131 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.423087 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:54:07.424786 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.424770 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:54:07.424888 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.424802 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:54:07.424888 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.424818 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:54:07.425960 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.425945 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:54:07.426077 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.426061 2567 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal" Apr 16 19:54:07.426126 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.426091 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:54:07.426638 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.426624 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:54:07.426703 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.426635 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:54:07.426703 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.426653 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:54:07.426703 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.426659 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:54:07.426703 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.426664 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:54:07.426703 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.426673 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:54:07.428159 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.428143 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal" Apr 16 19:54:07.428210 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.428175 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:54:07.428820 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.428802 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:54:07.428908 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.428830 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:54:07.428908 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.428861 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:54:07.449332 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.449311 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-138.ec2.internal\" not found" node="ip-10-0-136-138.ec2.internal" Apr 16 19:54:07.453123 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.453110 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-138.ec2.internal\" not found" node="ip-10-0-136-138.ec2.internal" Apr 16 19:54:07.494102 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.494080 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found" Apr 16 19:54:07.594512 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.594496 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found" Apr 16 19:54:07.601873 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.601858 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/af31fba9843cd1e10fca39c5f1285846-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal\" (UID: \"af31fba9843cd1e10fca39c5f1285846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal" Apr 16 19:54:07.601923 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.601883 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af31fba9843cd1e10fca39c5f1285846-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal\" (UID: \"af31fba9843cd1e10fca39c5f1285846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal" Apr 16 19:54:07.601923 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.601908 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/21c919eac4810510b21920a294dfc127-config\") pod \"kube-apiserver-proxy-ip-10-0-136-138.ec2.internal\" (UID: \"21c919eac4810510b21920a294dfc127\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal" Apr 16 19:54:07.695236 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.695213 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found" Apr 16 19:54:07.702613 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.702597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af31fba9843cd1e10fca39c5f1285846-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal\" (UID: \"af31fba9843cd1e10fca39c5f1285846\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal"
Apr 16 19:54:07.702667 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.702624 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/21c919eac4810510b21920a294dfc127-config\") pod \"kube-apiserver-proxy-ip-10-0-136-138.ec2.internal\" (UID: \"21c919eac4810510b21920a294dfc127\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal"
Apr 16 19:54:07.702667 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.702640 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/af31fba9843cd1e10fca39c5f1285846-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal\" (UID: \"af31fba9843cd1e10fca39c5f1285846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal"
Apr 16 19:54:07.702800 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.702692 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/af31fba9843cd1e10fca39c5f1285846-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal\" (UID: \"af31fba9843cd1e10fca39c5f1285846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal"
Apr 16 19:54:07.702800 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.702712 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af31fba9843cd1e10fca39c5f1285846-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal\" (UID: \"af31fba9843cd1e10fca39c5f1285846\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal"
Apr 16 19:54:07.702800 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.702722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/21c919eac4810510b21920a294dfc127-config\") pod \"kube-apiserver-proxy-ip-10-0-136-138.ec2.internal\" (UID: \"21c919eac4810510b21920a294dfc127\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal"
Apr 16 19:54:07.751737 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.751719 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal"
Apr 16 19:54:07.756213 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:07.756197 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal"
Apr 16 19:54:07.796219 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.796198 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found"
Apr 16 19:54:07.896748 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.896724 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found"
Apr 16 19:54:07.997333 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:07.997264 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found"
Apr 16 19:54:08.097811 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:08.097789 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found"
Apr 16 19:54:08.103241 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.103223 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 19:54:08.103383 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.103367 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:54:08.164099 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.164073 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:54:08.198183 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:08.198141 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found"
Apr 16 19:54:08.199293 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.199274 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 19:54:08.207337 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.207319 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:54:08.208500 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.208479 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:49:07 +0000 UTC" deadline="2027-10-12 18:50:43.914270036 +0000 UTC"
Apr 16 19:54:08.208571 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.208500 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13054h56m35.705772181s"
Apr 16 19:54:08.212155 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:08.212132 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf31fba9843cd1e10fca39c5f1285846.slice/crio-98e5993e14de8fd556046c8c4872d5b21644b509bda74f01a11b1933010a3d41 WatchSource:0}: Error finding container 98e5993e14de8fd556046c8c4872d5b21644b509bda74f01a11b1933010a3d41: Status 404 returned error can't find the container with id 98e5993e14de8fd556046c8c4872d5b21644b509bda74f01a11b1933010a3d41
Apr 16 19:54:08.212758 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:08.212730 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c919eac4810510b21920a294dfc127.slice/crio-e4668f12c46331063e965c37229244e65fe45d3a284919fe438103e09887bb99 WatchSource:0}: Error finding container e4668f12c46331063e965c37229244e65fe45d3a284919fe438103e09887bb99: Status 404 returned error can't find the container with id e4668f12c46331063e965c37229244e65fe45d3a284919fe438103e09887bb99
Apr 16 19:54:08.219060 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.219047 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:54:08.230529 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.230511 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hbc2h"
Apr 16 19:54:08.241000 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.240980 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hbc2h"
Apr 16 19:54:08.299146 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:08.299091 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found"
Apr 16 19:54:08.326009 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.325920 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal" event={"ID":"af31fba9843cd1e10fca39c5f1285846","Type":"ContainerStarted","Data":"98e5993e14de8fd556046c8c4872d5b21644b509bda74f01a11b1933010a3d41"}
Apr 16 19:54:08.327619 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.327592 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal" event={"ID":"21c919eac4810510b21920a294dfc127","Type":"ContainerStarted","Data":"e4668f12c46331063e965c37229244e65fe45d3a284919fe438103e09887bb99"}
Apr 16 19:54:08.399881 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:08.399851 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found"
Apr 16 19:54:08.500298 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:08.500265 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-138.ec2.internal\" not found"
Apr 16 19:54:08.532611 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.532581 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:54:08.600422 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.600355 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal"
Apr 16 19:54:08.611011 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.610987 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:54:08.613032 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.613008 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal"
Apr 16 19:54:08.620722 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.620704 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:54:08.751866 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.751626 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:54:08.938826 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:08.938794 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:54:09.177093 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.177060 2567 apiserver.go:52] "Watching apiserver"
Apr 16 19:54:09.181411 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.181392 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 19:54:09.181756 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.181732 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-lr7sw","kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal","openshift-cluster-node-tuning-operator/tuned-qldv9","openshift-image-registry/node-ca-xz5br","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal","openshift-multus/network-metrics-daemon-x5ml5","openshift-network-diagnostics/network-check-target-f7rhh","openshift-network-operator/iptables-alerter-h7tf4","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk","openshift-multus/multus-additional-cni-plugins-f7hpw","openshift-multus/multus-zh765","openshift-ovn-kubernetes/ovnkube-node-j56zc"]
Apr 16 19:54:09.183201 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.183182 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lr7sw"
Apr 16 19:54:09.185428 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.185094 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.185428 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.185178 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 19:54:09.185428 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.185231 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 19:54:09.185428 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.185180 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zhggq\""
Apr 16 19:54:09.186401 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.186381 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xz5br"
Apr 16 19:54:09.186624 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.186607 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-r72cj\""
Apr 16 19:54:09.186999 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.186984 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 19:54:09.186999 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.186991 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:54:09.187792 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.187458 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:09.187792 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.187581 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d"
Apr 16 19:54:09.187980 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.187941 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 19:54:09.188267 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.188247 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 19:54:09.188355 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.188314 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xsjdh\""
Apr 16 19:54:09.188481 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.188458 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 19:54:09.188573 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.188524 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:09.188642 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.188578 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae"
Apr 16 19:54:09.188697 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.188656 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h7tf4"
Apr 16 19:54:09.189992 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.189807 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk"
Apr 16 19:54:09.190210 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.190192 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 19:54:09.190351 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.190335 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:54:09.190677 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.190659 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 19:54:09.190746 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.190727 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gklx6\""
Apr 16 19:54:09.191796 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.191424 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 19:54:09.191796 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.191662 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 19:54:09.191893 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.191818 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 19:54:09.192178 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.192151 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-89t4j\""
Apr 16 19:54:09.193070 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.192311 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f7hpw"
Apr 16 19:54:09.193070 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.192403 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.193599 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.193577 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.194255 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.194236 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 19:54:09.194409 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.194376 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dc4zp\""
Apr 16 19:54:09.194495 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.194478 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 19:54:09.194563 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.194530 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 19:54:09.194676 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.194662 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 19:54:09.194757 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.194722 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 19:54:09.194814 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.194777 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9tprm\""
Apr 16 19:54:09.194949 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.194933 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 19:54:09.196768 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.196751 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 19:54:09.197867 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.196880 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 19:54:09.197867 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.197349 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 19:54:09.197867 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.197675 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 19:54:09.198111 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.198094 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 19:54:09.198388 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.198357 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-d5svt\""
Apr 16 19:54:09.198932 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.198912 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 19:54:09.201542 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.201523 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 19:54:09.210758 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.210739 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-cni-bin\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.210871 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.210770 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-sys-fs\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk"
Apr 16 19:54:09.210871 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.210793 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-sysconfig\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.210871 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.210853 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lctrm\" (UniqueName: \"kubernetes.io/projected/5f32c181-e6f9-4aa8-b370-e213007636e9-kube-api-access-lctrm\") pod \"node-ca-xz5br\" (UID: \"5f32c181-e6f9-4aa8-b370-e213007636e9\") " pod="openshift-image-registry/node-ca-xz5br"
Apr 16 19:54:09.211041 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.210893 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-cnibin\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.211041 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.210925 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-daemon-config\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.211041 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.210953 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-etc-kubernetes\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.211041 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.210979 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42c7c\" (UniqueName: \"kubernetes.io/projected/be3aeb91-80a8-4720-87d2-6479ec1370fc-kube-api-access-42c7c\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.211041 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211009 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nttv5\" (UniqueName: \"kubernetes.io/projected/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-kube-api-access-nttv5\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.211041 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211032 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f32c181-e6f9-4aa8-b370-e213007636e9-serviceca\") pod \"node-ca-xz5br\" (UID: \"5f32c181-e6f9-4aa8-b370-e213007636e9\") " pod="openshift-image-registry/node-ca-xz5br"
Apr 16 19:54:09.211276 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-cnibin\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw"
Apr 16 19:54:09.211276 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211108 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-run-openvswitch\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.211276 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211133 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-kubernetes\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.211276 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211155 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-var-lib-kubelet\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.211276 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211178 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-kubelet\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.211276 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211200 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-run-ovn\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.211276 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211221 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-sysctl-d\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.211276 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211245 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw"
Apr 16 19:54:09.211276 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211268 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-run-netns\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211295 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffszx\" (UniqueName: \"kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx\") pod \"network-check-target-f7rhh\" (UID: \"fba32900-cb28-4cb9-8c67-8874eb5f06ae\") " pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211320 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-etc-selinux\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211342 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-run-ovn-kubernetes\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211365 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-sysctl-conf\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211395 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-tmp\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211416 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-var-lib-cni-bin\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211440 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v5bv\" (UniqueName: \"kubernetes.io/projected/28cb7eb4-d997-43b5-a1a2-73abb55230e3-kube-api-access-5v5bv\") pod \"iptables-alerter-h7tf4\" (UID: \"28cb7eb4-d997-43b5-a1a2-73abb55230e3\") " pod="openshift-network-operator/iptables-alerter-h7tf4"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211463 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-systemd-units\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211497 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be3aeb91-80a8-4720-87d2-6479ec1370fc-ovnkube-script-lib\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211520 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211545 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v44v2\" (UniqueName: \"kubernetes.io/projected/f4789667-3ad6-413b-9c9e-a072e7b79d5d-kube-api-access-v44v2\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:09.211588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211586 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211619 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-system-cni-dir\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211643 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-hostroot\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211665 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-etc-openvswitch\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211701 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-modprobe-d\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211725 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-run\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211769 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-host\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211793 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-os-release\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211817 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-conf-dir\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211872 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsv7l\" (UniqueName: \"kubernetes.io/projected/1a8e9f53-a312-4d09-93df-0fd1a68610ff-kube-api-access-gsv7l\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211920 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28cb7eb4-d997-43b5-a1a2-73abb55230e3-iptables-alerter-script\") pod \"iptables-alerter-h7tf4\" (UID: \"28cb7eb4-d997-43b5-a1a2-73abb55230e3\") " pod="openshift-network-operator/iptables-alerter-h7tf4"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211958 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be3aeb91-80a8-4720-87d2-6479ec1370fc-env-overrides\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.211990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-cni-dir\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765"
Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212028 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpbm4\" (UniqueName: \"kubernetes.io/projected/ad01a266-64de-4515-920d-5076e4f40e3f-kube-api-access-vpbm4\") pod \"aws-ebs-csi-driver-node-58lpk\"
(UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212051 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-var-lib-openvswitch\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.212078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212082 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-cni-netd\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212111 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/28cb7eb4-d997-43b5-a1a2-73abb55230e3-host-slash\") pod \"iptables-alerter-h7tf4\" (UID: \"28cb7eb4-d997-43b5-a1a2-73abb55230e3\") " pod="openshift-network-operator/iptables-alerter-h7tf4" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212132 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f32c181-e6f9-4aa8-b370-e213007636e9-host\") pod \"node-ca-xz5br\" (UID: \"5f32c181-e6f9-4aa8-b370-e213007636e9\") " pod="openshift-image-registry/node-ca-xz5br" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212154 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-run-multus-certs\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212182 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-registration-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212205 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e13fe6ed-e68d-4328-9562-990f76414842-agent-certs\") pod \"konnectivity-agent-lr7sw\" (UID: \"e13fe6ed-e68d-4328-9562-990f76414842\") " pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212228 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-tuned\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212251 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-var-lib-kubelet\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212275 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212298 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-run-systemd\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212323 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be3aeb91-80a8-4720-87d2-6479ec1370fc-ovnkube-config\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212370 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-lib-modules\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212410 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: 
\"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212445 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-var-lib-cni-multus\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212500 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-socket-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212543 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-node-log\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.212706 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212565 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-sys\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/1a8e9f53-a312-4d09-93df-0fd1a68610ff-cni-binary-copy\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212627 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-system-cni-dir\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212695 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-device-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212726 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-slash\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212754 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-run-netns\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212779 
2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212806 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-os-release\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212830 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-socket-dir-parent\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212907 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be3aeb91-80a8-4720-87d2-6479ec1370fc-ovn-node-metrics-cert\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212935 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e13fe6ed-e68d-4328-9562-990f76414842-konnectivity-ca\") pod 
\"konnectivity-agent-lr7sw\" (UID: \"e13fe6ed-e68d-4328-9562-990f76414842\") " pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212957 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-systemd\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212973 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-run-k8s-cni-cncf-io\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.212996 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-log-socket\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.213020 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.213454 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.213045 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-r5std\" (UniqueName: \"kubernetes.io/projected/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-kube-api-access-r5std\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.241644 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.241604 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:49:08 +0000 UTC" deadline="2027-11-16 07:29:01.478641047 +0000 UTC" Apr 16 19:54:09.241644 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.241644 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13883h34m52.237000615s" Apr 16 19:54:09.313967 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.313931 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffszx\" (UniqueName: \"kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx\") pod \"network-check-target-f7rhh\" (UID: \"fba32900-cb28-4cb9-8c67-8874eb5f06ae\") " pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:09.314140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.313976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-etc-selinux\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.314140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314003 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-run-ovn-kubernetes\") pod \"ovnkube-node-j56zc\" (UID: 
\"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.314140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314027 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-sysctl-conf\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.314140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314050 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-tmp\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.314140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314073 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-var-lib-cni-bin\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.314140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314097 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5v5bv\" (UniqueName: \"kubernetes.io/projected/28cb7eb4-d997-43b5-a1a2-73abb55230e3-kube-api-access-5v5bv\") pod \"iptables-alerter-h7tf4\" (UID: \"28cb7eb4-d997-43b5-a1a2-73abb55230e3\") " pod="openshift-network-operator/iptables-alerter-h7tf4" Apr 16 19:54:09.314140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314097 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-etc-selinux\") pod 
\"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.314140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-systemd-units\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be3aeb91-80a8-4720-87d2-6479ec1370fc-ovnkube-script-lib\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314173 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314199 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v44v2\" (UniqueName: \"kubernetes.io/projected/f4789667-3ad6-413b-9c9e-a072e7b79d5d-kube-api-access-v44v2\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314219 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-run-ovn-kubernetes\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314226 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314248 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-system-cni-dir\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-hostroot\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314287 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-etc-openvswitch\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314308 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-modprobe-d\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314327 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-run\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314349 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-host\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314354 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-sysctl-conf\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314381 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-os-release\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314400 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-host\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314407 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-conf-dir\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314432 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsv7l\" (UniqueName: \"kubernetes.io/projected/1a8e9f53-a312-4d09-93df-0fd1a68610ff-kube-api-access-gsv7l\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314459 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28cb7eb4-d997-43b5-a1a2-73abb55230e3-iptables-alerter-script\") pod \"iptables-alerter-h7tf4\" (UID: \"28cb7eb4-d997-43b5-a1a2-73abb55230e3\") " pod="openshift-network-operator/iptables-alerter-h7tf4" Apr 16 19:54:09.314512 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314482 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be3aeb91-80a8-4720-87d2-6479ec1370fc-env-overrides\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314505 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-cni-dir\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314528 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpbm4\" (UniqueName: \"kubernetes.io/projected/ad01a266-64de-4515-920d-5076e4f40e3f-kube-api-access-vpbm4\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314553 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-var-lib-openvswitch\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314575 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-cni-netd\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314578 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-hostroot\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314601 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/28cb7eb4-d997-43b5-a1a2-73abb55230e3-host-slash\") pod \"iptables-alerter-h7tf4\" (UID: \"28cb7eb4-d997-43b5-a1a2-73abb55230e3\") " pod="openshift-network-operator/iptables-alerter-h7tf4" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314625 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f32c181-e6f9-4aa8-b370-e213007636e9-host\") pod \"node-ca-xz5br\" (UID: \"5f32c181-e6f9-4aa8-b370-e213007636e9\") " pod="openshift-image-registry/node-ca-xz5br" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314650 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-run-multus-certs\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314670 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-registration-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314692 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e13fe6ed-e68d-4328-9562-990f76414842-agent-certs\") pod \"konnectivity-agent-lr7sw\" (UID: \"e13fe6ed-e68d-4328-9562-990f76414842\") " pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314698 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314714 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-tuned\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-modprobe-d\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314740 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-var-lib-kubelet\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314741 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314766 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314790 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-run-systemd\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.315405 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314797 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-system-cni-dir\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314813 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be3aeb91-80a8-4720-87d2-6479ec1370fc-ovnkube-config\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314859 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-lib-modules\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314867 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5f32c181-e6f9-4aa8-b370-e213007636e9-host\") pod \"node-ca-xz5br\" (UID: \"5f32c181-e6f9-4aa8-b370-e213007636e9\") " pod="openshift-image-registry/node-ca-xz5br" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314911 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-var-lib-cni-multus\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314928 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-os-release\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314940 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-socket-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314625 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-etc-openvswitch\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.314768 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-run\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315017 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-run-systemd\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315026 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-conf-dir\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315057 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-node-log\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315091 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-sys\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315119 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1a8e9f53-a312-4d09-93df-0fd1a68610ff-cni-binary-copy\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315145 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-system-cni-dir\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315171 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-device-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315213 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-slash\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.316245 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315239 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-run-netns\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315264 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-os-release\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315295 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-var-lib-openvswitch\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315319 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-socket-dir-parent\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 
19:54:09.315413 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-var-lib-cni-bin\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315438 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-system-cni-dir\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315471 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-systemd-units\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315481 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be3aeb91-80a8-4720-87d2-6479ec1370fc-ovn-node-metrics-cert\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315504 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e13fe6ed-e68d-4328-9562-990f76414842-konnectivity-ca\") pod \"konnectivity-agent-lr7sw\" (UID: \"e13fe6ed-e68d-4328-9562-990f76414842\") " pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:09.317136 ip-10-0-136-138 
kubenswrapper[2567]: I0416 19:54:09.315522 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-sys\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315527 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-systemd\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315552 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-run-k8s-cni-cncf-io\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315564 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-socket-dir-parent\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315574 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-log-socket\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: E0416 
19:54:09.315591 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315599 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-var-lib-kubelet\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.317136 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315620 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5std\" (UniqueName: \"kubernetes.io/projected/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-kube-api-access-r5std\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-cni-netd\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-cni-bin\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315674 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.315680 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs podName:f4789667-3ad6-413b-9c9e-a072e7b79d5d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:09.815646759 +0000 UTC m=+3.091323343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs") pod "network-metrics-daemon-x5ml5" (UID: "f4789667-3ad6-413b-9c9e-a072e7b79d5d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315704 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/28cb7eb4-d997-43b5-a1a2-73abb55230e3-host-slash\") pod \"iptables-alerter-h7tf4\" (UID: \"28cb7eb4-d997-43b5-a1a2-73abb55230e3\") " pod="openshift-network-operator/iptables-alerter-h7tf4" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315716 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-sys-fs\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315736 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-node-log\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315714 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-registration-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315778 
2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be3aeb91-80a8-4720-87d2-6479ec1370fc-env-overrides\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315794 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-sysconfig\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315870 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-run-multus-certs\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315676 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-cni-bin\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316007 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-sys-fs\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316304 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be3aeb91-80a8-4720-87d2-6479ec1370fc-ovnkube-script-lib\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316331 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28cb7eb4-d997-43b5-a1a2-73abb55230e3-iptables-alerter-script\") pod \"iptables-alerter-h7tf4\" (UID: \"28cb7eb4-d997-43b5-a1a2-73abb55230e3\") " pod="openshift-network-operator/iptables-alerter-h7tf4" Apr 16 19:54:09.317880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316376 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-device-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316414 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ad01a266-64de-4515-920d-5076e4f40e3f-socket-dir\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316458 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-log-socket\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.318555 ip-10-0-136-138 
kubenswrapper[2567]: I0416 19:54:09.316559 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1a8e9f53-a312-4d09-93df-0fd1a68610ff-cni-binary-copy\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316605 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-cni-dir\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316642 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-slash\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316816 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-run-netns\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316811 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-systemd\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316892 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-os-release\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.315747 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-sysconfig\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316930 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e13fe6ed-e68d-4328-9562-990f76414842-konnectivity-ca\") pod \"konnectivity-agent-lr7sw\" (UID: \"e13fe6ed-e68d-4328-9562-990f76414842\") " pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316940 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lctrm\" (UniqueName: \"kubernetes.io/projected/5f32c181-e6f9-4aa8-b370-e213007636e9-kube-api-access-lctrm\") pod \"node-ca-xz5br\" (UID: \"5f32c181-e6f9-4aa8-b370-e213007636e9\") " pod="openshift-image-registry/node-ca-xz5br" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316945 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-var-lib-cni-multus\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.316983 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-run-k8s-cni-cncf-io\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317019 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-lib-modules\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-cnibin\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317073 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-daemon-config\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317187 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-etc-kubernetes\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.318555 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317206 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-42c7c\" (UniqueName: \"kubernetes.io/projected/be3aeb91-80a8-4720-87d2-6479ec1370fc-kube-api-access-42c7c\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317223 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nttv5\" (UniqueName: \"kubernetes.io/projected/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-kube-api-access-nttv5\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317241 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f32c181-e6f9-4aa8-b370-e213007636e9-serviceca\") pod \"node-ca-xz5br\" (UID: \"5f32c181-e6f9-4aa8-b370-e213007636e9\") " pod="openshift-image-registry/node-ca-xz5br" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317285 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-cnibin\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317357 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-run-openvswitch\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317382 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-kubernetes\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317405 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-var-lib-kubelet\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317431 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-kubelet\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317455 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-run-ovn\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317480 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-sysctl-d\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317555 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317768 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-run-netns\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317876 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-host-run-netns\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.317936 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-tuned\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318249 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-cnibin\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318260 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318254 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be3aeb91-80a8-4720-87d2-6479ec1370fc-ovnkube-config\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.319418 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318312 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-kubernetes\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318332 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-cnibin\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318352 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-etc-sysctl-d\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318359 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-var-lib-kubelet\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318395 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-run-ovn\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-run-openvswitch\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-kubelet\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be3aeb91-80a8-4720-87d2-6479ec1370fc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 
19:54:09.318484 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f32c181-e6f9-4aa8-b370-e213007636e9-serviceca\") pod \"node-ca-xz5br\" (UID: \"5f32c181-e6f9-4aa8-b370-e213007636e9\") " pod="openshift-image-registry/node-ca-xz5br" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318486 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a8e9f53-a312-4d09-93df-0fd1a68610ff-etc-kubernetes\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318688 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be3aeb91-80a8-4720-87d2-6479ec1370fc-ovn-node-metrics-cert\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.318740 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1a8e9f53-a312-4d09-93df-0fd1a68610ff-multus-daemon-config\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.319083 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.320180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.319090 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.321487 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.320951 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:09.321487 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.320978 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:09.321487 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.320992 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ffszx for pod openshift-network-diagnostics/network-check-target-f7rhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:09.321487 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.321076 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e13fe6ed-e68d-4328-9562-990f76414842-agent-certs\") pod \"konnectivity-agent-lr7sw\" (UID: \"e13fe6ed-e68d-4328-9562-990f76414842\") " pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:09.321487 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.321104 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx podName:fba32900-cb28-4cb9-8c67-8874eb5f06ae nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:09.821087102 +0000 UTC m=+3.096763672 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ffszx" (UniqueName: "kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx") pod "network-check-target-f7rhh" (UID: "fba32900-cb28-4cb9-8c67-8874eb5f06ae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:09.321487 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.321221 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-tmp\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.323875 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.323586 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsv7l\" (UniqueName: \"kubernetes.io/projected/1a8e9f53-a312-4d09-93df-0fd1a68610ff-kube-api-access-gsv7l\") pod \"multus-zh765\" (UID: \"1a8e9f53-a312-4d09-93df-0fd1a68610ff\") " pod="openshift-multus/multus-zh765" Apr 16 19:54:09.324185 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.324162 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v44v2\" (UniqueName: \"kubernetes.io/projected/f4789667-3ad6-413b-9c9e-a072e7b79d5d-kube-api-access-v44v2\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:09.331807 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.331786 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v5bv\" (UniqueName: \"kubernetes.io/projected/28cb7eb4-d997-43b5-a1a2-73abb55230e3-kube-api-access-5v5bv\") pod 
\"iptables-alerter-h7tf4\" (UID: \"28cb7eb4-d997-43b5-a1a2-73abb55230e3\") " pod="openshift-network-operator/iptables-alerter-h7tf4" Apr 16 19:54:09.332605 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.332491 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpbm4\" (UniqueName: \"kubernetes.io/projected/ad01a266-64de-4515-920d-5076e4f40e3f-kube-api-access-vpbm4\") pod \"aws-ebs-csi-driver-node-58lpk\" (UID: \"ad01a266-64de-4515-920d-5076e4f40e3f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.332605 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.332562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lctrm\" (UniqueName: \"kubernetes.io/projected/5f32c181-e6f9-4aa8-b370-e213007636e9-kube-api-access-lctrm\") pod \"node-ca-xz5br\" (UID: \"5f32c181-e6f9-4aa8-b370-e213007636e9\") " pod="openshift-image-registry/node-ca-xz5br" Apr 16 19:54:09.333043 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.333023 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42c7c\" (UniqueName: \"kubernetes.io/projected/be3aeb91-80a8-4720-87d2-6479ec1370fc-kube-api-access-42c7c\") pod \"ovnkube-node-j56zc\" (UID: \"be3aeb91-80a8-4720-87d2-6479ec1370fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.333115 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.333058 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nttv5\" (UniqueName: \"kubernetes.io/projected/f261d50e-6c86-49ca-ad32-2c77ac5ecb6a-kube-api-access-nttv5\") pod \"tuned-qldv9\" (UID: \"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a\") " pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.338464 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.338442 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5std\" (UniqueName: 
\"kubernetes.io/projected/3753de4d-d4c5-4f6d-a1a3-bb06177d48f5-kube-api-access-r5std\") pod \"multus-additional-cni-plugins-f7hpw\" (UID: \"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5\") " pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.497881 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.497782 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:09.506722 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.506700 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qldv9" Apr 16 19:54:09.514333 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.514312 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xz5br" Apr 16 19:54:09.521930 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.521903 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h7tf4" Apr 16 19:54:09.529468 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.529441 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" Apr 16 19:54:09.534985 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.534966 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" Apr 16 19:54:09.541552 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.541534 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zh765" Apr 16 19:54:09.546121 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.546100 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:09.821933 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.821825 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffszx\" (UniqueName: \"kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx\") pod \"network-check-target-f7rhh\" (UID: \"fba32900-cb28-4cb9-8c67-8874eb5f06ae\") " pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:09.821933 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:09.821896 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:09.822151 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.821988 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:09.822151 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.822005 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:09.822151 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.822009 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:09.822151 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.822028 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ffszx for pod openshift-network-diagnostics/network-check-target-f7rhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:09.822151 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.822067 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs podName:f4789667-3ad6-413b-9c9e-a072e7b79d5d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:10.822046322 +0000 UTC m=+4.097722892 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs") pod "network-metrics-daemon-x5ml5" (UID: "f4789667-3ad6-413b-9c9e-a072e7b79d5d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:09.822151 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:09.822087 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx podName:fba32900-cb28-4cb9-8c67-8874eb5f06ae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:10.822077501 +0000 UTC m=+4.097754057 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ffszx" (UniqueName: "kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx") pod "network-check-target-f7rhh" (UID: "fba32900-cb28-4cb9-8c67-8874eb5f06ae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:09.880933 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:09.880901 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe3aeb91_80a8_4720_87d2_6479ec1370fc.slice/crio-70f2db66ab82c49f93aea380e5b28501039f816e450048ca9098d363bc612d93 WatchSource:0}: Error finding container 70f2db66ab82c49f93aea380e5b28501039f816e450048ca9098d363bc612d93: Status 404 returned error can't find the container with id 70f2db66ab82c49f93aea380e5b28501039f816e450048ca9098d363bc612d93 Apr 16 19:54:09.882342 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:09.882314 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad01a266_64de_4515_920d_5076e4f40e3f.slice/crio-e45aa7188091a34f5b885a28b442dd9f99160c9772315aba4fec75251819c10a WatchSource:0}: Error finding container e45aa7188091a34f5b885a28b442dd9f99160c9772315aba4fec75251819c10a: Status 404 returned error can't find the container with id e45aa7188091a34f5b885a28b442dd9f99160c9772315aba4fec75251819c10a Apr 16 19:54:09.883852 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:09.883772 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f32c181_e6f9_4aa8_b370_e213007636e9.slice/crio-44e12ebfac20e4697a0c625c9f6dbc8019ce8c27574281278a7adaf7089166f7 WatchSource:0}: Error finding container 44e12ebfac20e4697a0c625c9f6dbc8019ce8c27574281278a7adaf7089166f7: Status 404 returned error can't find the 
container with id 44e12ebfac20e4697a0c625c9f6dbc8019ce8c27574281278a7adaf7089166f7 Apr 16 19:54:09.886325 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:09.886302 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28cb7eb4_d997_43b5_a1a2_73abb55230e3.slice/crio-c56e5a7da599fc5f6e64a520abb0d93362264b1c08ef758b40b29fefd2efbb7e WatchSource:0}: Error finding container c56e5a7da599fc5f6e64a520abb0d93362264b1c08ef758b40b29fefd2efbb7e: Status 404 returned error can't find the container with id c56e5a7da599fc5f6e64a520abb0d93362264b1c08ef758b40b29fefd2efbb7e Apr 16 19:54:09.887611 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:09.887589 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a8e9f53_a312_4d09_93df_0fd1a68610ff.slice/crio-9bd5a4a734c711ac6d6fb2079dbb5b65ab19be1470dad1810750a9dd226b9d68 WatchSource:0}: Error finding container 9bd5a4a734c711ac6d6fb2079dbb5b65ab19be1470dad1810750a9dd226b9d68: Status 404 returned error can't find the container with id 9bd5a4a734c711ac6d6fb2079dbb5b65ab19be1470dad1810750a9dd226b9d68 Apr 16 19:54:09.888995 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:09.888970 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13fe6ed_e68d_4328_9562_990f76414842.slice/crio-cb3eab300bdbea1ffed5c49294a5f773b5ecd4a6a24249c254d37dd0d75b1fdc WatchSource:0}: Error finding container cb3eab300bdbea1ffed5c49294a5f773b5ecd4a6a24249c254d37dd0d75b1fdc: Status 404 returned error can't find the container with id cb3eab300bdbea1ffed5c49294a5f773b5ecd4a6a24249c254d37dd0d75b1fdc Apr 16 19:54:10.245219 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.244967 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:49:08 +0000 UTC" 
deadline="2027-12-15 12:18:04.900878255 +0000 UTC" Apr 16 19:54:10.245219 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.245218 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14584h23m54.655666298s" Apr 16 19:54:10.335676 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.335623 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xz5br" event={"ID":"5f32c181-e6f9-4aa8-b370-e213007636e9","Type":"ContainerStarted","Data":"44e12ebfac20e4697a0c625c9f6dbc8019ce8c27574281278a7adaf7089166f7"} Apr 16 19:54:10.337134 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.337078 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" event={"ID":"ad01a266-64de-4515-920d-5076e4f40e3f","Type":"ContainerStarted","Data":"e45aa7188091a34f5b885a28b442dd9f99160c9772315aba4fec75251819c10a"} Apr 16 19:54:10.339096 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.339029 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" event={"ID":"be3aeb91-80a8-4720-87d2-6479ec1370fc","Type":"ContainerStarted","Data":"70f2db66ab82c49f93aea380e5b28501039f816e450048ca9098d363bc612d93"} Apr 16 19:54:10.342180 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.341749 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal" event={"ID":"21c919eac4810510b21920a294dfc127","Type":"ContainerStarted","Data":"366d094605e94356cb7689875cd27e7cb8f42e652dc0539538ac59da3d0329a2"} Apr 16 19:54:10.346047 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.345214 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qldv9" 
event={"ID":"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a","Type":"ContainerStarted","Data":"f220e4d6ae51b61acf314fd722bb09ae55c48e6d803631bad2fe81fd049d1fc0"} Apr 16 19:54:10.347126 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.347085 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zh765" event={"ID":"1a8e9f53-a312-4d09-93df-0fd1a68610ff","Type":"ContainerStarted","Data":"9bd5a4a734c711ac6d6fb2079dbb5b65ab19be1470dad1810750a9dd226b9d68"} Apr 16 19:54:10.356052 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.356022 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h7tf4" event={"ID":"28cb7eb4-d997-43b5-a1a2-73abb55230e3","Type":"ContainerStarted","Data":"c56e5a7da599fc5f6e64a520abb0d93362264b1c08ef758b40b29fefd2efbb7e"} Apr 16 19:54:10.357317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.357271 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-138.ec2.internal" podStartSLOduration=2.357256643 podStartE2EDuration="2.357256643s" podCreationTimestamp="2026-04-16 19:54:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:10.356360463 +0000 UTC m=+3.632037037" watchObservedRunningTime="2026-04-16 19:54:10.357256643 +0000 UTC m=+3.632933221" Apr 16 19:54:10.360363 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.360316 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lr7sw" event={"ID":"e13fe6ed-e68d-4328-9562-990f76414842","Type":"ContainerStarted","Data":"cb3eab300bdbea1ffed5c49294a5f773b5ecd4a6a24249c254d37dd0d75b1fdc"} Apr 16 19:54:10.364463 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.364443 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" 
event={"ID":"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5","Type":"ContainerStarted","Data":"5f229995b3879c970f2b528d7e0f20b002a43e4062e1ee3619dd4af6ed418eab"} Apr 16 19:54:10.830362 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.830329 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffszx\" (UniqueName: \"kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx\") pod \"network-check-target-f7rhh\" (UID: \"fba32900-cb28-4cb9-8c67-8874eb5f06ae\") " pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:10.830505 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:10.830385 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:10.830570 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:10.830528 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:10.830621 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:10.830582 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs podName:f4789667-3ad6-413b-9c9e-a072e7b79d5d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:12.830562445 +0000 UTC m=+6.106239004 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs") pod "network-metrics-daemon-x5ml5" (UID: "f4789667-3ad6-413b-9c9e-a072e7b79d5d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:10.830687 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:10.830660 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:10.830687 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:10.830674 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:10.830687 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:10.830686 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ffszx for pod openshift-network-diagnostics/network-check-target-f7rhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:10.830944 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:10.830721 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx podName:fba32900-cb28-4cb9-8c67-8874eb5f06ae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:12.830709304 +0000 UTC m=+6.106385858 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ffszx" (UniqueName: "kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx") pod "network-check-target-f7rhh" (UID: "fba32900-cb28-4cb9-8c67-8874eb5f06ae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:11.323558 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.323008 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:11.323558 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:11.323140 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae"
Apr 16 19:54:11.323558 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.323401 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:11.323558 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:11.323496 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d"
Apr 16 19:54:11.378192 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.376998 2567 generic.go:358] "Generic (PLEG): container finished" podID="af31fba9843cd1e10fca39c5f1285846" containerID="4334e77fbc9337e21c697268f8310d310ee2d4af1095adc75cb15eccb25e7490" exitCode=0
Apr 16 19:54:11.378192 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.377854 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal" event={"ID":"af31fba9843cd1e10fca39c5f1285846","Type":"ContainerDied","Data":"4334e77fbc9337e21c697268f8310d310ee2d4af1095adc75cb15eccb25e7490"}
Apr 16 19:54:11.639132 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.639102 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-w8m7p"]
Apr 16 19:54:11.641127 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.641104 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:11.643950 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.643919 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 19:54:11.644292 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.644276 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 19:54:11.644504 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.644487 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-x284t\""
Apr 16 19:54:11.736646 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.736603 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/24f35a04-1a02-4c6d-86fb-0b68fcd8fbec-hosts-file\") pod \"node-resolver-w8m7p\" (UID: \"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec\") " pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:11.736817 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.736656 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28fpl\" (UniqueName: \"kubernetes.io/projected/24f35a04-1a02-4c6d-86fb-0b68fcd8fbec-kube-api-access-28fpl\") pod \"node-resolver-w8m7p\" (UID: \"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec\") " pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:11.736817 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.736696 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/24f35a04-1a02-4c6d-86fb-0b68fcd8fbec-tmp-dir\") pod \"node-resolver-w8m7p\" (UID: \"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec\") " pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:11.837284 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.837245 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/24f35a04-1a02-4c6d-86fb-0b68fcd8fbec-hosts-file\") pod \"node-resolver-w8m7p\" (UID: \"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec\") " pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:11.837448 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.837302 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28fpl\" (UniqueName: \"kubernetes.io/projected/24f35a04-1a02-4c6d-86fb-0b68fcd8fbec-kube-api-access-28fpl\") pod \"node-resolver-w8m7p\" (UID: \"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec\") " pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:11.837448 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.837342 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/24f35a04-1a02-4c6d-86fb-0b68fcd8fbec-tmp-dir\") pod \"node-resolver-w8m7p\" (UID: \"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec\") " pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:11.837714 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.837692 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/24f35a04-1a02-4c6d-86fb-0b68fcd8fbec-tmp-dir\") pod \"node-resolver-w8m7p\" (UID: \"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec\") " pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:11.837790 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.837773 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/24f35a04-1a02-4c6d-86fb-0b68fcd8fbec-hosts-file\") pod \"node-resolver-w8m7p\" (UID: \"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec\") " pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:11.867902 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.866664 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28fpl\" (UniqueName: \"kubernetes.io/projected/24f35a04-1a02-4c6d-86fb-0b68fcd8fbec-kube-api-access-28fpl\") pod \"node-resolver-w8m7p\" (UID: \"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec\") " pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:11.968374 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:11.968335 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w8m7p"
Apr 16 19:54:12.383283 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:12.383242 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal" event={"ID":"af31fba9843cd1e10fca39c5f1285846","Type":"ContainerStarted","Data":"633a6a6397afa4a39018bae6ce5363433459eb966b5a951458089ec32f1df72e"}
Apr 16 19:54:12.398290 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:12.398237 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-138.ec2.internal" podStartSLOduration=4.398219611 podStartE2EDuration="4.398219611s" podCreationTimestamp="2026-04-16 19:54:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:12.398017901 +0000 UTC m=+5.673694476" watchObservedRunningTime="2026-04-16 19:54:12.398219611 +0000 UTC m=+5.673896187"
Apr 16 19:54:12.847178 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:12.847140 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffszx\" (UniqueName: \"kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx\") pod \"network-check-target-f7rhh\" (UID: \"fba32900-cb28-4cb9-8c67-8874eb5f06ae\") " pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:12.847348 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:12.847191 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:12.847348 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:12.847321 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:12.847449 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:12.847378 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs podName:f4789667-3ad6-413b-9c9e-a072e7b79d5d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.847361204 +0000 UTC m=+10.123037779 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs") pod "network-metrics-daemon-x5ml5" (UID: "f4789667-3ad6-413b-9c9e-a072e7b79d5d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:12.847817 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:12.847786 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:12.847817 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:12.847806 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:12.847817 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:12.847816 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ffszx for pod openshift-network-diagnostics/network-check-target-f7rhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:12.848010 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:12.847875 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx podName:fba32900-cb28-4cb9-8c67-8874eb5f06ae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.847860951 +0000 UTC m=+10.123537507 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ffszx" (UniqueName: "kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx") pod "network-check-target-f7rhh" (UID: "fba32900-cb28-4cb9-8c67-8874eb5f06ae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:13.323103 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:13.323056 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:13.323284 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:13.323195 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae"
Apr 16 19:54:13.323612 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:13.323585 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:13.323742 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:13.323694 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d"
Apr 16 19:54:15.323045 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:15.323003 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:15.323444 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:15.323160 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d"
Apr 16 19:54:15.323686 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:15.323659 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:15.323791 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:15.323762 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae"
Apr 16 19:54:16.879220 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:16.879184 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffszx\" (UniqueName: \"kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx\") pod \"network-check-target-f7rhh\" (UID: \"fba32900-cb28-4cb9-8c67-8874eb5f06ae\") " pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:16.879775 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:16.879238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:16.879775 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:16.879366 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:16.879775 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:16.879392 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:16.879775 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:16.879406 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ffszx for pod openshift-network-diagnostics/network-check-target-f7rhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:16.879775 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:16.879373 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:16.879775 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:16.879463 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx podName:fba32900-cb28-4cb9-8c67-8874eb5f06ae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:24.879445161 +0000 UTC m=+18.155121715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ffszx" (UniqueName: "kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx") pod "network-check-target-f7rhh" (UID: "fba32900-cb28-4cb9-8c67-8874eb5f06ae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:16.879775 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:16.879509 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs podName:f4789667-3ad6-413b-9c9e-a072e7b79d5d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:24.879489084 +0000 UTC m=+18.155165655 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs") pod "network-metrics-daemon-x5ml5" (UID: "f4789667-3ad6-413b-9c9e-a072e7b79d5d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:17.324164 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:17.324131 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:17.324312 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:17.324244 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae"
Apr 16 19:54:17.324912 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:17.324723 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:17.324912 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:17.324869 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d"
Apr 16 19:54:19.323140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:19.323038 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:19.323140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:19.323099 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:19.323688 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:19.323193 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d"
Apr 16 19:54:19.323688 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:19.323331 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae"
Apr 16 19:54:21.322820 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:21.322163 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:21.322820 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:21.322296 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d"
Apr 16 19:54:21.322820 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:21.322360 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:21.322820 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:21.322473 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae"
Apr 16 19:54:23.322831 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:23.322794 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:23.322831 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:23.322816 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:23.323396 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:23.322923 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae"
Apr 16 19:54:23.323396 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:23.323256 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d"
Apr 16 19:54:24.937522 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:24.937486 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffszx\" (UniqueName: \"kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx\") pod \"network-check-target-f7rhh\" (UID: \"fba32900-cb28-4cb9-8c67-8874eb5f06ae\") " pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:24.937913 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:24.937531 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:24.937913 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:24.937641 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:24.937913 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:24.937657 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:24.937913 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:24.937674 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:24.937913 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:24.937683 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ffszx for pod openshift-network-diagnostics/network-check-target-f7rhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:24.937913 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:24.937697 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs podName:f4789667-3ad6-413b-9c9e-a072e7b79d5d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:40.937678092 +0000 UTC m=+34.213354648 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs") pod "network-metrics-daemon-x5ml5" (UID: "f4789667-3ad6-413b-9c9e-a072e7b79d5d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:24.937913 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:24.937724 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx podName:fba32900-cb28-4cb9-8c67-8874eb5f06ae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:40.9377114 +0000 UTC m=+34.213387954 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ffszx" (UniqueName: "kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx") pod "network-check-target-f7rhh" (UID: "fba32900-cb28-4cb9-8c67-8874eb5f06ae") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:25.322410 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:25.322314 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:25.322410 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:25.322329 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:25.322634 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:25.322479 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae"
Apr 16 19:54:25.322690 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:25.322641 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d"
Apr 16 19:54:27.066562 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:27.066531 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24f35a04_1a02_4c6d_86fb_0b68fcd8fbec.slice/crio-9d3bfe87e6741976a68b2f6e0c700652fef4ea9c076ca7f35f3fca4bc9e760f0 WatchSource:0}: Error finding container 9d3bfe87e6741976a68b2f6e0c700652fef4ea9c076ca7f35f3fca4bc9e760f0: Status 404 returned error can't find the container with id 9d3bfe87e6741976a68b2f6e0c700652fef4ea9c076ca7f35f3fca4bc9e760f0
Apr 16 19:54:27.323019 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.322780 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh"
Apr 16 19:54:27.324551 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:27.323959 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae"
Apr 16 19:54:27.324551 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.324402 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5"
Apr 16 19:54:27.324551 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:27.324507 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d"
Apr 16 19:54:27.407688 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.407638 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qldv9" event={"ID":"f261d50e-6c86-49ca-ad32-2c77ac5ecb6a","Type":"ContainerStarted","Data":"ebf28a58360bcfa3c055acb035385ed85b28f62d2fc9619142cf4777124ae2f8"}
Apr 16 19:54:27.409866 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.409658 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zh765" event={"ID":"1a8e9f53-a312-4d09-93df-0fd1a68610ff","Type":"ContainerStarted","Data":"d11966bacd5e5bc8a22b29a6e35f81bab2a71aa15f26ca7a5993b86c619f3aec"}
Apr 16 19:54:27.411634 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.411469 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w8m7p" event={"ID":"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec","Type":"ContainerStarted","Data":"9d3bfe87e6741976a68b2f6e0c700652fef4ea9c076ca7f35f3fca4bc9e760f0"}
Apr 16 19:54:27.413154 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.412807 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lr7sw" event={"ID":"e13fe6ed-e68d-4328-9562-990f76414842","Type":"ContainerStarted","Data":"bb095d64041bd928ceb365876fb3f9cdd2f258036c651f16f07cc2e4b1d7f1f3"}
Apr 16 19:54:27.414041 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.414022 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" event={"ID":"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5","Type":"ContainerStarted","Data":"0d276e0b68f965fcced922132a00c4ce46b0296306d26387a5da2cfaa228e109"}
Apr 16 19:54:27.417003 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.416019 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xz5br" event={"ID":"5f32c181-e6f9-4aa8-b370-e213007636e9","Type":"ContainerStarted","Data":"7c4e16b65f18ab386f3449ecc8a52506d5f33ed0d084e942f061717072c71008"}
Apr 16 19:54:27.417745 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.417722 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" event={"ID":"be3aeb91-80a8-4720-87d2-6479ec1370fc","Type":"ContainerStarted","Data":"abc81a061ca1d818c8433e69f23e540bf832f6f1f4b30cb3fff90e03af7c86f6"}
Apr 16 19:54:27.425799 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.425615 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qldv9" podStartSLOduration=3.195918342 podStartE2EDuration="20.425604065s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:54:09.894288889 +0000 UTC m=+3.169965442" lastFinishedPulling="2026-04-16 19:54:27.123974599 +0000 UTC m=+20.399651165" observedRunningTime="2026-04-16 19:54:27.425265714 +0000 UTC m=+20.700942300" watchObservedRunningTime="2026-04-16 19:54:27.425604065 +0000 UTC m=+20.701280637"
Apr 16 19:54:27.445519 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.445474 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zh765" podStartSLOduration=3.210792142 podStartE2EDuration="20.445460307s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:54:09.889676466 +0000 UTC m=+3.165353020" lastFinishedPulling="2026-04-16 19:54:27.124344618 +0000 UTC m=+20.400021185" observedRunningTime="2026-04-16 19:54:27.445333676 +0000 UTC m=+20.721010250" watchObservedRunningTime="2026-04-16 19:54:27.445460307 +0000 UTC m=+20.721136881"
Apr 16 19:54:27.476322 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.476286 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lr7sw" podStartSLOduration=8.061404447 podStartE2EDuration="20.476273647s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:54:09.89158642 +0000 UTC m=+3.167262976" lastFinishedPulling="2026-04-16 19:54:22.306455624 +0000 UTC m=+15.582132176" observedRunningTime="2026-04-16 19:54:27.460368133 +0000 UTC m=+20.736044710" watchObservedRunningTime="2026-04-16 19:54:27.476273647 +0000 UTC m=+20.751950222"
Apr 16 19:54:27.476441 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:27.476420 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xz5br" podStartSLOduration=3.275224499 podStartE2EDuration="20.476414458s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:54:09.885906293 +0000 UTC m=+3.161582846" lastFinishedPulling="2026-04-16 19:54:27.087096251 +0000 UTC m=+20.362772805" observedRunningTime="2026-04-16 19:54:27.476002238 +0000 UTC m=+20.751678813" watchObservedRunningTime="2026-04-16 19:54:27.476414458 +0000 UTC m=+20.752091032"
Apr 16 19:54:28.420264 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.420025 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w8m7p" event={"ID":"24f35a04-1a02-4c6d-86fb-0b68fcd8fbec","Type":"ContainerStarted","Data":"745c69dd156ce3e40ed496820ff06473e6a20bd3a9b05fabf543621750b47b9a"}
Apr 16 19:54:28.421300 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.421268 2567 generic.go:358] "Generic (PLEG): container finished" podID="3753de4d-d4c5-4f6d-a1a3-bb06177d48f5" containerID="0d276e0b68f965fcced922132a00c4ce46b0296306d26387a5da2cfaa228e109" exitCode=0
Apr 16 19:54:28.421411 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.421335 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" event={"ID":"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5","Type":"ContainerDied","Data":"0d276e0b68f965fcced922132a00c4ce46b0296306d26387a5da2cfaa228e109"}
Apr 16 19:54:28.422532 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.422512 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" event={"ID":"ad01a266-64de-4515-920d-5076e4f40e3f","Type":"ContainerStarted","Data":"4fa1d1c331d3ae99204db2e420f947412f9cde343f89abff88c2d9bdde37ccdb"}
Apr 16 19:54:28.424888 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.424871 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log"
Apr 16 19:54:28.425129 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.425109 2567 generic.go:358] "Generic (PLEG): container finished" podID="be3aeb91-80a8-4720-87d2-6479ec1370fc" containerID="77b092356c094899578ddbe34bf5d4366b363ff68c4251cc161993a8fc40e029" exitCode=1
Apr 16 19:54:28.425240 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.425219 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" event={"ID":"be3aeb91-80a8-4720-87d2-6479ec1370fc","Type":"ContainerStarted","Data":"7dea87a74f000bad182301968af15d27b2ba2b75f77fcf33a5289308e9b1e817"}
Apr 16 19:54:28.425300 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.425247 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" event={"ID":"be3aeb91-80a8-4720-87d2-6479ec1370fc","Type":"ContainerStarted","Data":"3a9c690a5e3a40a73d2d993ff0f281792b53840924513a61d2bf92893ce77af4"}
Apr 16 19:54:28.425300 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.425255 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" event={"ID":"be3aeb91-80a8-4720-87d2-6479ec1370fc","Type":"ContainerStarted","Data":"bd4792b76a0db2e7646744922a39f9dec4208c5893cdb164f5d346f5bfb4a216"}
Apr 16 19:54:28.425300 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.425265 2567 kubelet.go:2569]
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" event={"ID":"be3aeb91-80a8-4720-87d2-6479ec1370fc","Type":"ContainerStarted","Data":"21daf365614c5202bbbcaf64e9fbccba180d13baa41d6c9fa81ad07289b54baf"} Apr 16 19:54:28.425300 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.425272 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" event={"ID":"be3aeb91-80a8-4720-87d2-6479ec1370fc","Type":"ContainerDied","Data":"77b092356c094899578ddbe34bf5d4366b363ff68c4251cc161993a8fc40e029"} Apr 16 19:54:28.435298 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.435266 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w8m7p" podStartSLOduration=17.435253972 podStartE2EDuration="17.435253972s" podCreationTimestamp="2026-04-16 19:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:28.434899821 +0000 UTC m=+21.710576395" watchObservedRunningTime="2026-04-16 19:54:28.435253972 +0000 UTC m=+21.710930546" Apr 16 19:54:28.737033 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:28.737008 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:54:29.281563 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:29.281445 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:28.737030848Z","UUID":"2fc7c7f1-6f04-4f8c-b4ab-cc1ed9631eb4","Handler":null,"Name":"","Endpoint":""} Apr 16 19:54:29.283473 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:29.283438 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:54:29.283473 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:29.283467 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:54:29.323066 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:29.323026 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:29.323243 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:29.323097 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:29.323243 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:29.323174 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d" Apr 16 19:54:29.323361 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:29.323304 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae" Apr 16 19:54:29.429342 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:29.429171 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" event={"ID":"ad01a266-64de-4515-920d-5076e4f40e3f","Type":"ContainerStarted","Data":"2bf068794e2510a174df44b06ba6b9b05cce873a46641fff726ad7da1e93043b"} Apr 16 19:54:29.431351 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:29.431320 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h7tf4" event={"ID":"28cb7eb4-d997-43b5-a1a2-73abb55230e3","Type":"ContainerStarted","Data":"c897cce64bd30f01d824b9350a73e5c744197abb7bca9b488826570447ef01f9"} Apr 16 19:54:29.447724 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:29.447664 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-h7tf4" podStartSLOduration=5.248759701 podStartE2EDuration="22.447645949s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:54:09.888159687 +0000 UTC m=+3.163836240" lastFinishedPulling="2026-04-16 19:54:27.08704592 +0000 UTC m=+20.362722488" observedRunningTime="2026-04-16 19:54:29.447189287 +0000 UTC m=+22.722865863" watchObservedRunningTime="2026-04-16 19:54:29.447645949 +0000 UTC m=+22.723322525" Apr 16 19:54:30.435289 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:30.435043 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" event={"ID":"ad01a266-64de-4515-920d-5076e4f40e3f","Type":"ContainerStarted","Data":"87915e2ce5da006f3f05c5dbc820902cd1608be8a8f2f8c8c3ccee8b2d6aa0c8"} Apr 16 19:54:30.437931 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:30.437907 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log" Apr 16 19:54:30.438341 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:30.438310 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" event={"ID":"be3aeb91-80a8-4720-87d2-6479ec1370fc","Type":"ContainerStarted","Data":"eddfdd905f84c1b6bcf4ed5920c27dd3490a5deb9925f41d0bc1325e834f9b8a"} Apr 16 19:54:30.455884 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:30.455823 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-58lpk" podStartSLOduration=3.504535188 podStartE2EDuration="23.4558105s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:54:09.884450868 +0000 UTC m=+3.160127421" lastFinishedPulling="2026-04-16 19:54:29.83572618 +0000 UTC m=+23.111402733" observedRunningTime="2026-04-16 19:54:30.455346674 +0000 UTC m=+23.731023249" watchObservedRunningTime="2026-04-16 19:54:30.4558105 +0000 UTC m=+23.731487074" Apr 16 19:54:31.322379 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:31.322342 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:31.322379 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:31.322369 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:31.322639 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:31.322474 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d" Apr 16 19:54:31.322639 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:31.322537 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae" Apr 16 19:54:32.203398 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:32.203367 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:32.204140 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:32.204117 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:32.442110 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:32.442083 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:32.442660 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:32.442637 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lr7sw" Apr 16 19:54:33.322902 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.322868 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:33.322902 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.322881 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:33.323527 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:33.322989 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d" Apr 16 19:54:33.323527 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:33.322986 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae" Apr 16 19:54:33.445284 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.445248 2567 generic.go:358] "Generic (PLEG): container finished" podID="3753de4d-d4c5-4f6d-a1a3-bb06177d48f5" containerID="2db60e87fbc8243276cb1f669c5f31f5b2f53a13a00f33743f053969a58ea793" exitCode=0 Apr 16 19:54:33.445435 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.445304 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" event={"ID":"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5","Type":"ContainerDied","Data":"2db60e87fbc8243276cb1f669c5f31f5b2f53a13a00f33743f053969a58ea793"} Apr 16 19:54:33.448314 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.448296 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log" Apr 16 19:54:33.448616 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.448593 
2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" event={"ID":"be3aeb91-80a8-4720-87d2-6479ec1370fc","Type":"ContainerStarted","Data":"4fd4fb5701d8ee93a95cc26eff1003a715ce7b155d696cef53c8d8945fe114d7"} Apr 16 19:54:33.449031 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.449014 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:33.449117 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.449041 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:33.449177 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.449158 2567 scope.go:117] "RemoveContainer" containerID="77b092356c094899578ddbe34bf5d4366b363ff68c4251cc161993a8fc40e029" Apr 16 19:54:33.464574 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.464552 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:33.466045 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:33.465247 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:34.456787 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:34.456770 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log" Apr 16 19:54:34.457299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:34.457098 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" event={"ID":"be3aeb91-80a8-4720-87d2-6479ec1370fc","Type":"ContainerStarted","Data":"9d743eb23c06c7d5db480a2141cd94cf0c0f35689e49723bf85f7bd42734fe15"} Apr 16 19:54:34.457299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:34.457203 2567 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Apr 16 19:54:34.490135 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:34.490086 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" podStartSLOduration=10.189279901999999 podStartE2EDuration="27.490069009s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:54:09.884242407 +0000 UTC m=+3.159918960" lastFinishedPulling="2026-04-16 19:54:27.185031497 +0000 UTC m=+20.460708067" observedRunningTime="2026-04-16 19:54:34.489711789 +0000 UTC m=+27.765388364" watchObservedRunningTime="2026-04-16 19:54:34.490069009 +0000 UTC m=+27.765745586" Apr 16 19:54:34.802261 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:34.802232 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x5ml5"] Apr 16 19:54:34.802460 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:34.802350 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:34.802460 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:34.802438 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d" Apr 16 19:54:34.804979 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:34.804946 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-f7rhh"] Apr 16 19:54:34.805195 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:34.805057 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:34.805195 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:34.805149 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae" Apr 16 19:54:35.460972 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:35.460938 2567 generic.go:358] "Generic (PLEG): container finished" podID="3753de4d-d4c5-4f6d-a1a3-bb06177d48f5" containerID="1a389050d65535f2f59b39997d92b15ccc8fce27ae1ebae1284512bfcd9eac8a" exitCode=0 Apr 16 19:54:35.461344 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:35.461021 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" event={"ID":"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5","Type":"ContainerDied","Data":"1a389050d65535f2f59b39997d92b15ccc8fce27ae1ebae1284512bfcd9eac8a"} Apr 16 19:54:35.461344 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:35.461142 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:54:36.322322 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:36.322135 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:36.322459 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:36.322134 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:36.322459 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:36.322407 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d" Apr 16 19:54:36.322559 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:36.322463 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae" Apr 16 19:54:37.467617 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:37.467577 2567 generic.go:358] "Generic (PLEG): container finished" podID="3753de4d-d4c5-4f6d-a1a3-bb06177d48f5" containerID="e2257acb0987084f600f3028df7b000066eeeed31dfcec7ed7d06032c4484c0e" exitCode=0 Apr 16 19:54:37.468103 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:37.467652 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" event={"ID":"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5","Type":"ContainerDied","Data":"e2257acb0987084f600f3028df7b000066eeeed31dfcec7ed7d06032c4484c0e"} Apr 16 19:54:37.544293 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:37.544267 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" Apr 16 19:54:37.544485 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:37.544473 2567 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Apr 16 19:54:37.554547 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:37.554499 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" podUID="be3aeb91-80a8-4720-87d2-6479ec1370fc" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 19:54:37.563523 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:37.563498 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc" podUID="be3aeb91-80a8-4720-87d2-6479ec1370fc" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 19:54:38.323035 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:38.323004 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:38.323205 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:38.322990 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:38.323205 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:38.323168 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x5ml5" podUID="f4789667-3ad6-413b-9c9e-a072e7b79d5d" Apr 16 19:54:38.323298 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:38.323279 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f7rhh" podUID="fba32900-cb28-4cb9-8c67-8874eb5f06ae" Apr 16 19:54:39.278509 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:39.278481 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w8m7p_24f35a04-1a02-4c6d-86fb-0b68fcd8fbec/dns-node-resolver/0.log" Apr 16 19:54:40.056285 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.053867 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-138.ec2.internal" event="NodeReady" Apr 16 19:54:40.056285 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.054119 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:54:40.323023 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.322986 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:40.323637 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.322986 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:40.325782 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.325758 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:54:40.326423 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.326379 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nnp77\"" Apr 16 19:54:40.326423 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.326390 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:54:40.326423 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.326391 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:54:40.326611 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.326399 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qj6k2\"" Apr 16 19:54:40.462078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.462051 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xz5br_5f32c181-e6f9-4aa8-b370-e213007636e9/node-ca/0.log" Apr 16 19:54:40.957532 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.957500 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffszx\" (UniqueName: \"kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx\") pod \"network-check-target-f7rhh\" (UID: \"fba32900-cb28-4cb9-8c67-8874eb5f06ae\") " pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:40.957725 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.957551 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:54:40.957725 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:40.957690 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 19:54:40.957874 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:54:40.957770 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs podName:f4789667-3ad6-413b-9c9e-a072e7b79d5d nodeName:}" failed. No retries permitted until 2026-04-16 19:55:12.957749251 +0000 UTC m=+66.233425807 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs") pod "network-metrics-daemon-x5ml5" (UID: "f4789667-3ad6-413b-9c9e-a072e7b79d5d") : secret "metrics-daemon-secret" not found Apr 16 19:54:40.963589 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:40.963565 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffszx\" (UniqueName: \"kubernetes.io/projected/fba32900-cb28-4cb9-8c67-8874eb5f06ae-kube-api-access-ffszx\") pod \"network-check-target-f7rhh\" (UID: \"fba32900-cb28-4cb9-8c67-8874eb5f06ae\") " pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:41.241420 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:41.241340 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:42.979554 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:42.979404 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-f7rhh"] Apr 16 19:54:42.983494 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:54:42.983464 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba32900_cb28_4cb9_8c67_8874eb5f06ae.slice/crio-94347a7f533c48845ead7a11b67aca6cbf6d8d4ef1d891a28f0208bcbc848ee6 WatchSource:0}: Error finding container 94347a7f533c48845ead7a11b67aca6cbf6d8d4ef1d891a28f0208bcbc848ee6: Status 404 returned error can't find the container with id 94347a7f533c48845ead7a11b67aca6cbf6d8d4ef1d891a28f0208bcbc848ee6 Apr 16 19:54:43.480718 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:43.480682 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-f7rhh" event={"ID":"fba32900-cb28-4cb9-8c67-8874eb5f06ae","Type":"ContainerStarted","Data":"94347a7f533c48845ead7a11b67aca6cbf6d8d4ef1d891a28f0208bcbc848ee6"} Apr 16 19:54:43.483437 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:43.483414 2567 generic.go:358] "Generic (PLEG): container finished" podID="3753de4d-d4c5-4f6d-a1a3-bb06177d48f5" containerID="05a4aa9633935ec9ea5810d74ba43f8074f5e61c10f3f398ee444525b50746ec" exitCode=0 Apr 16 19:54:43.483602 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:43.483452 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" event={"ID":"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5","Type":"ContainerDied","Data":"05a4aa9633935ec9ea5810d74ba43f8074f5e61c10f3f398ee444525b50746ec"} Apr 16 19:54:44.491110 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:44.491073 2567 generic.go:358] "Generic (PLEG): container finished" 
podID="3753de4d-d4c5-4f6d-a1a3-bb06177d48f5" containerID="61cb833583eaa306cd7718f91b65aff1c4527aaac5bbad2d53beec67700581ca" exitCode=0 Apr 16 19:54:44.491741 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:44.491122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" event={"ID":"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5","Type":"ContainerDied","Data":"61cb833583eaa306cd7718f91b65aff1c4527aaac5bbad2d53beec67700581ca"} Apr 16 19:54:45.496680 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:45.496446 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" event={"ID":"3753de4d-d4c5-4f6d-a1a3-bb06177d48f5","Type":"ContainerStarted","Data":"0814c9c3842b3c6d8d27f9df5a5204545f1c2565a7a59ee1c6ae6c7688957cbc"} Apr 16 19:54:46.500124 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:46.500089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-f7rhh" event={"ID":"fba32900-cb28-4cb9-8c67-8874eb5f06ae","Type":"ContainerStarted","Data":"de816a73bff4642105800ea3b072542844fa36b10055fce0e74dd5487e811491"} Apr 16 19:54:46.500515 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:46.500309 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:54:46.519156 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:46.519117 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-f7hpw" podStartSLOduration=6.24020195 podStartE2EDuration="39.519105314s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:54:09.89401137 +0000 UTC m=+3.169687939" lastFinishedPulling="2026-04-16 19:54:43.172914749 +0000 UTC m=+36.448591303" observedRunningTime="2026-04-16 19:54:45.540262864 +0000 UTC m=+38.815939436" 
watchObservedRunningTime="2026-04-16 19:54:46.519105314 +0000 UTC m=+39.794781886" Apr 16 19:54:46.519675 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:54:46.519644 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-f7rhh" podStartSLOduration=36.805117055 podStartE2EDuration="39.519634274s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:54:42.985258113 +0000 UTC m=+36.260934666" lastFinishedPulling="2026-04-16 19:54:45.699775329 +0000 UTC m=+38.975451885" observedRunningTime="2026-04-16 19:54:46.518591915 +0000 UTC m=+39.794268490" watchObservedRunningTime="2026-04-16 19:54:46.519634274 +0000 UTC m=+39.795310846" Apr 16 19:55:03.147930 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.147889 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5"] Apr 16 19:55:03.192968 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.192942 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7"] Apr 16 19:55:03.193116 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.193093 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.195954 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.195934 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 19:55:03.196123 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.195996 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 19:55:03.196346 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.196328 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 19:55:03.199032 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.199015 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 19:55:03.207356 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.207339 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9"] Apr 16 19:55:03.207480 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.207466 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" Apr 16 19:55:03.211739 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.211720 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-9lnzx\"" Apr 16 19:55:03.212366 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.212350 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 19:55:03.219199 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.219177 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7"] Apr 16 19:55:03.219199 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.219200 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5"] Apr 16 19:55:03.219310 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.219208 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9"] Apr 16 19:55:03.219310 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.219273 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.221184 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.221159 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 19:55:03.221270 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.221167 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 19:55:03.221709 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.221695 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 19:55:03.221797 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.221782 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 19:55:03.253943 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.253917 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9h7rg"] Apr 16 19:55:03.285989 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.285971 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ngw62"] Apr 16 19:55:03.286087 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.286072 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.287919 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.287902 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-668h7\"" Apr 16 19:55:03.288012 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.287999 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:55:03.288318 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.288306 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:55:03.299280 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299262 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.299382 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299297 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2375fe89-b353-4e12-9bfd-8baae7811d1d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.299382 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299326 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twn2p\" (UniqueName: \"kubernetes.io/projected/896262b8-9bfe-4660-a56b-3e90d052848e-kube-api-access-twn2p\") pod 
\"klusterlet-addon-workmgr-5f68c6c6c4-jp9r5\" (UID: \"896262b8-9bfe-4660-a56b-3e90d052848e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.299382 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299351 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fa0c7489-a6b1-416f-a3dd-1fea62a7ff84-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68bff8456f-nfzv7\" (UID: \"fa0c7489-a6b1-416f-a3dd-1fea62a7ff84\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" Apr 16 19:55:03.299382 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299378 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-ca\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.299587 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299402 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twdtv\" (UniqueName: \"kubernetes.io/projected/2375fe89-b353-4e12-9bfd-8baae7811d1d-kube-api-access-twdtv\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.299587 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299431 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: 
\"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.299587 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299465 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/896262b8-9bfe-4660-a56b-3e90d052848e-klusterlet-config\") pod \"klusterlet-addon-workmgr-5f68c6c6c4-jp9r5\" (UID: \"896262b8-9bfe-4660-a56b-3e90d052848e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.299587 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299489 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-hub\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.299587 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299531 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mxmd\" (UniqueName: \"kubernetes.io/projected/fa0c7489-a6b1-416f-a3dd-1fea62a7ff84-kube-api-access-5mxmd\") pod \"managed-serviceaccount-addon-agent-68bff8456f-nfzv7\" (UID: \"fa0c7489-a6b1-416f-a3dd-1fea62a7ff84\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" Apr 16 19:55:03.299587 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.299558 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/896262b8-9bfe-4660-a56b-3e90d052848e-tmp\") pod \"klusterlet-addon-workmgr-5f68c6c6c4-jp9r5\" (UID: \"896262b8-9bfe-4660-a56b-3e90d052848e\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.316532 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.316503 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9h7rg"] Apr 16 19:55:03.316532 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.316530 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ngw62"] Apr 16 19:55:03.316658 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.316630 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ngw62" Apr 16 19:55:03.318428 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.318407 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:55:03.318509 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.318472 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:55:03.318677 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.318659 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:55:03.318781 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.318767 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfkbf\"" Apr 16 19:55:03.373979 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.373949 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gsnhj"] Apr 16 19:55:03.399281 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.399229 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gsnhj"] Apr 16 19:55:03.399380 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.399338 
2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.399797 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.399780 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2375fe89-b353-4e12-9bfd-8baae7811d1d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.399909 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.399808 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-config-volume\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.399909 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.399825 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c3453a-e123-4e53-b238-f0fd985f362c-cert\") pod \"ingress-canary-ngw62\" (UID: \"f7c3453a-e123-4e53-b238-f0fd985f362c\") " pod="openshift-ingress-canary/ingress-canary-ngw62" Apr 16 19:55:03.399909 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.399865 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twn2p\" (UniqueName: \"kubernetes.io/projected/896262b8-9bfe-4660-a56b-3e90d052848e-kube-api-access-twn2p\") pod \"klusterlet-addon-workmgr-5f68c6c6c4-jp9r5\" (UID: \"896262b8-9bfe-4660-a56b-3e90d052848e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.399909 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.399893 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fa0c7489-a6b1-416f-a3dd-1fea62a7ff84-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68bff8456f-nfzv7\" (UID: \"fa0c7489-a6b1-416f-a3dd-1fea62a7ff84\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" Apr 16 19:55:03.400108 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.399949 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-ca\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.400108 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.399983 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twdtv\" (UniqueName: \"kubernetes.io/projected/2375fe89-b353-4e12-9bfd-8baae7811d1d-kube-api-access-twdtv\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.400108 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400016 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmpg\" (UniqueName: \"kubernetes.io/projected/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-kube-api-access-5rmpg\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.400108 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400046 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-hub-kubeconfig\") pod 
\"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.400108 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/896262b8-9bfe-4660-a56b-3e90d052848e-klusterlet-config\") pod \"klusterlet-addon-workmgr-5f68c6c6c4-jp9r5\" (UID: \"896262b8-9bfe-4660-a56b-3e90d052848e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.400345 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-hub\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.400399 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400385 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdjxz\" (UniqueName: \"kubernetes.io/projected/f7c3453a-e123-4e53-b238-f0fd985f362c-kube-api-access-cdjxz\") pod \"ingress-canary-ngw62\" (UID: \"f7c3453a-e123-4e53-b238-f0fd985f362c\") " pod="openshift-ingress-canary/ingress-canary-ngw62" Apr 16 19:55:03.400458 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mxmd\" (UniqueName: \"kubernetes.io/projected/fa0c7489-a6b1-416f-a3dd-1fea62a7ff84-kube-api-access-5mxmd\") pod \"managed-serviceaccount-addon-agent-68bff8456f-nfzv7\" (UID: \"fa0c7489-a6b1-416f-a3dd-1fea62a7ff84\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" Apr 16 19:55:03.400514 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400478 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/896262b8-9bfe-4660-a56b-3e90d052848e-tmp\") pod \"klusterlet-addon-workmgr-5f68c6c6c4-jp9r5\" (UID: \"896262b8-9bfe-4660-a56b-3e90d052848e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.400514 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400509 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-metrics-tls\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.400626 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400534 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-tmp-dir\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.400626 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400619 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.401119 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.400938 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/896262b8-9bfe-4660-a56b-3e90d052848e-tmp\") pod \"klusterlet-addon-workmgr-5f68c6c6c4-jp9r5\" (UID: \"896262b8-9bfe-4660-a56b-3e90d052848e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.401611 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.401584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2375fe89-b353-4e12-9bfd-8baae7811d1d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.404123 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404101 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/896262b8-9bfe-4660-a56b-3e90d052848e-klusterlet-config\") pod \"klusterlet-addon-workmgr-5f68c6c6c4-jp9r5\" (UID: \"896262b8-9bfe-4660-a56b-3e90d052848e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.404222 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404122 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-ca\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.404222 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404173 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:55:03.404333 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404245 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:55:03.404333 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404250 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-hub\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.404333 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404261 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.404500 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404374 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2375fe89-b353-4e12-9bfd-8baae7811d1d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.404500 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404391 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pxnkg\"" Apr 16 19:55:03.404500 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404426 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:55:03.404500 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404438 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:55:03.404682 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.404634 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fa0c7489-a6b1-416f-a3dd-1fea62a7ff84-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68bff8456f-nfzv7\" (UID: \"fa0c7489-a6b1-416f-a3dd-1fea62a7ff84\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" Apr 16 19:55:03.412348 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.412327 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twn2p\" (UniqueName: \"kubernetes.io/projected/896262b8-9bfe-4660-a56b-3e90d052848e-kube-api-access-twn2p\") pod \"klusterlet-addon-workmgr-5f68c6c6c4-jp9r5\" (UID: \"896262b8-9bfe-4660-a56b-3e90d052848e\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.412653 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.412633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twdtv\" (UniqueName: \"kubernetes.io/projected/2375fe89-b353-4e12-9bfd-8baae7811d1d-kube-api-access-twdtv\") pod \"cluster-proxy-proxy-agent-56c79bdfd7-q77l9\" (UID: \"2375fe89-b353-4e12-9bfd-8baae7811d1d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.413500 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.413483 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mxmd\" (UniqueName: \"kubernetes.io/projected/fa0c7489-a6b1-416f-a3dd-1fea62a7ff84-kube-api-access-5mxmd\") pod \"managed-serviceaccount-addon-agent-68bff8456f-nfzv7\" (UID: \"fa0c7489-a6b1-416f-a3dd-1fea62a7ff84\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" Apr 16 19:55:03.501541 
ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501518 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c562fdae-037e-4053-be61-6f0b8eb48c63-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.501695 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501569 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-config-volume\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.501695 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501594 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c3453a-e123-4e53-b238-f0fd985f362c-cert\") pod \"ingress-canary-ngw62\" (UID: \"f7c3453a-e123-4e53-b238-f0fd985f362c\") " pod="openshift-ingress-canary/ingress-canary-ngw62" Apr 16 19:55:03.501695 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501616 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmpg\" (UniqueName: \"kubernetes.io/projected/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-kube-api-access-5rmpg\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.501695 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501639 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c562fdae-037e-4053-be61-6f0b8eb48c63-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gsnhj\" (UID: 
\"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.501695 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501657 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptq2m\" (UniqueName: \"kubernetes.io/projected/c562fdae-037e-4053-be61-6f0b8eb48c63-kube-api-access-ptq2m\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.501974 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdjxz\" (UniqueName: \"kubernetes.io/projected/f7c3453a-e123-4e53-b238-f0fd985f362c-kube-api-access-cdjxz\") pod \"ingress-canary-ngw62\" (UID: \"f7c3453a-e123-4e53-b238-f0fd985f362c\") " pod="openshift-ingress-canary/ingress-canary-ngw62" Apr 16 19:55:03.501974 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501832 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c562fdae-037e-4053-be61-6f0b8eb48c63-crio-socket\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.501974 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501894 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-metrics-tls\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.501974 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501903 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" Apr 16 19:55:03.501974 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501921 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-tmp-dir\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.501974 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.501950 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c562fdae-037e-4053-be61-6f0b8eb48c63-data-volume\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.502620 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.502106 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-config-volume\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.502620 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.502264 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-tmp-dir\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.504144 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.504124 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c3453a-e123-4e53-b238-f0fd985f362c-cert\") pod \"ingress-canary-ngw62\" (UID: \"f7c3453a-e123-4e53-b238-f0fd985f362c\") " 
pod="openshift-ingress-canary/ingress-canary-ngw62" Apr 16 19:55:03.504312 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.504291 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-metrics-tls\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.523159 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.522997 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdjxz\" (UniqueName: \"kubernetes.io/projected/f7c3453a-e123-4e53-b238-f0fd985f362c-kube-api-access-cdjxz\") pod \"ingress-canary-ngw62\" (UID: \"f7c3453a-e123-4e53-b238-f0fd985f362c\") " pod="openshift-ingress-canary/ingress-canary-ngw62" Apr 16 19:55:03.533146 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.533081 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" Apr 16 19:55:03.535880 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.535391 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" Apr 16 19:55:03.536233 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.536203 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rmpg\" (UniqueName: \"kubernetes.io/projected/83b13a4d-812f-4c1c-b1cd-cb6c294c587f-kube-api-access-5rmpg\") pod \"dns-default-9h7rg\" (UID: \"83b13a4d-812f-4c1c-b1cd-cb6c294c587f\") " pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.603743 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.603157 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:03.604257 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.603851 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c562fdae-037e-4053-be61-6f0b8eb48c63-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.604257 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.603888 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptq2m\" (UniqueName: \"kubernetes.io/projected/c562fdae-037e-4053-be61-6f0b8eb48c63-kube-api-access-ptq2m\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.604257 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.603922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c562fdae-037e-4053-be61-6f0b8eb48c63-crio-socket\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.604257 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.603956 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c562fdae-037e-4053-be61-6f0b8eb48c63-data-volume\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.604257 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.603985 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/c562fdae-037e-4053-be61-6f0b8eb48c63-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.604906 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.604563 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c562fdae-037e-4053-be61-6f0b8eb48c63-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.605031 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.605010 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c562fdae-037e-4053-be61-6f0b8eb48c63-crio-socket\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.608773 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.605577 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c562fdae-037e-4053-be61-6f0b8eb48c63-data-volume\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.609027 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.608790 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c562fdae-037e-4053-be61-6f0b8eb48c63-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.616133 ip-10-0-136-138 kubenswrapper[2567]: I0416 
19:55:03.616076 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptq2m\" (UniqueName: \"kubernetes.io/projected/c562fdae-037e-4053-be61-6f0b8eb48c63-kube-api-access-ptq2m\") pod \"insights-runtime-extractor-gsnhj\" (UID: \"c562fdae-037e-4053-be61-6f0b8eb48c63\") " pod="openshift-insights/insights-runtime-extractor-gsnhj" Apr 16 19:55:03.625322 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.625053 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ngw62" Apr 16 19:55:03.694420 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.694216 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5"] Apr 16 19:55:03.694981 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.694954 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7"] Apr 16 19:55:03.698275 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:55:03.698250 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa0c7489_a6b1_416f_a3dd_1fea62a7ff84.slice/crio-07f254b952e4e4865852195d2f92a03670b3f17cee2431c5b5cc1d7dee3c7a8a WatchSource:0}: Error finding container 07f254b952e4e4865852195d2f92a03670b3f17cee2431c5b5cc1d7dee3c7a8a: Status 404 returned error can't find the container with id 07f254b952e4e4865852195d2f92a03670b3f17cee2431c5b5cc1d7dee3c7a8a Apr 16 19:55:03.711956 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.711751 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9"] Apr 16 19:55:03.714323 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:55:03.714230 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2375fe89_b353_4e12_9bfd_8baae7811d1d.slice/crio-6048ab17b5e85d7ba45b155d5bbf168dc27b694e870d187f8be4edc23a89ff9f WatchSource:0}: Error finding container 6048ab17b5e85d7ba45b155d5bbf168dc27b694e870d187f8be4edc23a89ff9f: Status 404 returned error can't find the container with id 6048ab17b5e85d7ba45b155d5bbf168dc27b694e870d187f8be4edc23a89ff9f
Apr 16 19:55:03.725735 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.725711 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gsnhj"
Apr 16 19:55:03.760639 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.760613 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9h7rg"]
Apr 16 19:55:03.763695 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:55:03.762817 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83b13a4d_812f_4c1c_b1cd_cb6c294c587f.slice/crio-6fb03c0a67fdefebc38fbd0a03bb3d1cf859d39a2d31bfdd80180b182de92eb2 WatchSource:0}: Error finding container 6fb03c0a67fdefebc38fbd0a03bb3d1cf859d39a2d31bfdd80180b182de92eb2: Status 404 returned error can't find the container with id 6fb03c0a67fdefebc38fbd0a03bb3d1cf859d39a2d31bfdd80180b182de92eb2
Apr 16 19:55:03.775987 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.775961 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ngw62"]
Apr 16 19:55:03.781728 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:55:03.781700 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c3453a_e123_4e53_b238_f0fd985f362c.slice/crio-11760fb876b9ee59177fdebe4bb79cb6d9c79bbd83f8fbfb0ed69af33e71ba9e WatchSource:0}: Error finding container 11760fb876b9ee59177fdebe4bb79cb6d9c79bbd83f8fbfb0ed69af33e71ba9e: Status 404 returned error can't find the container with id 11760fb876b9ee59177fdebe4bb79cb6d9c79bbd83f8fbfb0ed69af33e71ba9e
Apr 16 19:55:03.921489 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:03.921459 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gsnhj"]
Apr 16 19:55:03.924452 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:55:03.924421 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc562fdae_037e_4053_be61_6f0b8eb48c63.slice/crio-8e487b444c2b8713c19ade8557cc6d9849b1892e319941ada12e2ae13bb59018 WatchSource:0}: Error finding container 8e487b444c2b8713c19ade8557cc6d9849b1892e319941ada12e2ae13bb59018: Status 404 returned error can't find the container with id 8e487b444c2b8713c19ade8557cc6d9849b1892e319941ada12e2ae13bb59018
Apr 16 19:55:04.544563 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:04.544521 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" event={"ID":"fa0c7489-a6b1-416f-a3dd-1fea62a7ff84","Type":"ContainerStarted","Data":"07f254b952e4e4865852195d2f92a03670b3f17cee2431c5b5cc1d7dee3c7a8a"}
Apr 16 19:55:04.548262 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:04.548235 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gsnhj" event={"ID":"c562fdae-037e-4053-be61-6f0b8eb48c63","Type":"ContainerStarted","Data":"bd848abb75a3b5322fcd5270059f58de38f15036135d67e3a56a257c7011c1ce"}
Apr 16 19:55:04.548386 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:04.548273 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gsnhj" event={"ID":"c562fdae-037e-4053-be61-6f0b8eb48c63","Type":"ContainerStarted","Data":"8e487b444c2b8713c19ade8557cc6d9849b1892e319941ada12e2ae13bb59018"}
Apr 16 19:55:04.551736 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:04.551663 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ngw62" event={"ID":"f7c3453a-e123-4e53-b238-f0fd985f362c","Type":"ContainerStarted","Data":"11760fb876b9ee59177fdebe4bb79cb6d9c79bbd83f8fbfb0ed69af33e71ba9e"}
Apr 16 19:55:04.554698 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:04.554674 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" event={"ID":"2375fe89-b353-4e12-9bfd-8baae7811d1d","Type":"ContainerStarted","Data":"6048ab17b5e85d7ba45b155d5bbf168dc27b694e870d187f8be4edc23a89ff9f"}
Apr 16 19:55:04.556695 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:04.556670 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" event={"ID":"896262b8-9bfe-4660-a56b-3e90d052848e","Type":"ContainerStarted","Data":"d982263255e28094da4e6672f382a6bd54eaa9154899e1dec2b8d60bd2a441e9"}
Apr 16 19:55:04.559864 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:04.559824 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9h7rg" event={"ID":"83b13a4d-812f-4c1c-b1cd-cb6c294c587f","Type":"ContainerStarted","Data":"6fb03c0a67fdefebc38fbd0a03bb3d1cf859d39a2d31bfdd80180b182de92eb2"}
Apr 16 19:55:06.567363 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:06.567302 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gsnhj" event={"ID":"c562fdae-037e-4053-be61-6f0b8eb48c63","Type":"ContainerStarted","Data":"87fe4b77cd223d8e5dc279c639d273efcbdc3bec1a059a641b2aadf6ab3a8f0d"}
Apr 16 19:55:07.565299 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:07.565273 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j56zc"
Apr 16 19:55:11.582543 ip-10-0-136-138 kubenswrapper[2567]: I0416
19:55:11.582501 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gsnhj" event={"ID":"c562fdae-037e-4053-be61-6f0b8eb48c63","Type":"ContainerStarted","Data":"1785d83da7b12efbcde4f0ce022c24b133ec49a71081fb1b968576e9bfec1f0f"}
Apr 16 19:55:11.583799 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.583775 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ngw62" event={"ID":"f7c3453a-e123-4e53-b238-f0fd985f362c","Type":"ContainerStarted","Data":"a834d3fa7450f5a72e3323b1497f67240f1354c5f113b546af3ddaa3d98ea974"}
Apr 16 19:55:11.585021 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.584998 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" event={"ID":"2375fe89-b353-4e12-9bfd-8baae7811d1d","Type":"ContainerStarted","Data":"d782e6a1529a02d18ec3450c9ad56df7e87155aeb07277985753a15024b26167"}
Apr 16 19:55:11.586296 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.586268 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" event={"ID":"896262b8-9bfe-4660-a56b-3e90d052848e","Type":"ContainerStarted","Data":"edbcbd9769282367f36433cf77c07701510f81a77faa5ee1706c973587c5899b"}
Apr 16 19:55:11.586474 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.586445 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5"
Apr 16 19:55:11.588021 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.587995 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9h7rg" event={"ID":"83b13a4d-812f-4c1c-b1cd-cb6c294c587f","Type":"ContainerStarted","Data":"1d16d5a4009a736cec0c158de4a2dc1b9a6d20c3a4bb0750c6a511d2130511e3"}
Apr 16 19:55:11.588021 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.588020 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9h7rg" event={"ID":"83b13a4d-812f-4c1c-b1cd-cb6c294c587f","Type":"ContainerStarted","Data":"057dca8f04f1c2d507bacce276e93fd7270f8a09d894494eac91c36f4a8eddb0"}
Apr 16 19:55:11.588229 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.588115 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9h7rg"
Apr 16 19:55:11.588335 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.588321 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5"
Apr 16 19:55:11.589252 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.589232 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" event={"ID":"fa0c7489-a6b1-416f-a3dd-1fea62a7ff84","Type":"ContainerStarted","Data":"0627d928d58a62c04017f34c7156a802910d170650546e37349ba3467404a1cc"}
Apr 16 19:55:11.601944 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.601908 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gsnhj" podStartSLOduration=1.511626813 podStartE2EDuration="8.601896886s" podCreationTimestamp="2026-04-16 19:55:03 +0000 UTC" firstStartedPulling="2026-04-16 19:55:04.055485798 +0000 UTC m=+57.331162351" lastFinishedPulling="2026-04-16 19:55:11.145755868 +0000 UTC m=+64.421432424" observedRunningTime="2026-04-16 19:55:11.601070843 +0000 UTC m=+64.876747419" watchObservedRunningTime="2026-04-16 19:55:11.601896886 +0000 UTC m=+64.877573460"
Apr 16 19:55:11.615171 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.615132 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bff8456f-nfzv7" podStartSLOduration=1.173155335 podStartE2EDuration="8.615121662s" podCreationTimestamp="2026-04-16 19:55:03 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.702609585 +0000 UTC m=+56.978286157" lastFinishedPulling="2026-04-16 19:55:11.144575928 +0000 UTC m=+64.420252484" observedRunningTime="2026-04-16 19:55:11.61464104 +0000 UTC m=+64.890317616" watchObservedRunningTime="2026-04-16 19:55:11.615121662 +0000 UTC m=+64.890798242"
Apr 16 19:55:11.633252 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.633209 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9h7rg" podStartSLOduration=1.765513001 podStartE2EDuration="8.633196598s" podCreationTimestamp="2026-04-16 19:55:03 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.765281493 +0000 UTC m=+57.040958046" lastFinishedPulling="2026-04-16 19:55:10.632965088 +0000 UTC m=+63.908641643" observedRunningTime="2026-04-16 19:55:11.632908318 +0000 UTC m=+64.908584892" watchObservedRunningTime="2026-04-16 19:55:11.633196598 +0000 UTC m=+64.908873174"
Apr 16 19:55:11.649754 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.649688 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f68c6c6c4-jp9r5" podStartSLOduration=1.211733448 podStartE2EDuration="8.649678066s" podCreationTimestamp="2026-04-16 19:55:03 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.700095262 +0000 UTC m=+56.975771818" lastFinishedPulling="2026-04-16 19:55:11.138039883 +0000 UTC m=+64.413716436" observedRunningTime="2026-04-16 19:55:11.648779898 +0000 UTC m=+64.924456473" watchObservedRunningTime="2026-04-16 19:55:11.649678066 +0000 UTC m=+64.925354640"
Apr 16 19:55:11.665031 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:11.664998 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ngw62" podStartSLOduration=1.319783299
podStartE2EDuration="8.664987703s" podCreationTimestamp="2026-04-16 19:55:03 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.785289036 +0000 UTC m=+57.060965590" lastFinishedPulling="2026-04-16 19:55:11.130493423 +0000 UTC m=+64.406169994" observedRunningTime="2026-04-16 19:55:11.66437121 +0000 UTC m=+64.940047785" watchObservedRunningTime="2026-04-16 19:55:11.664987703 +0000 UTC m=+64.940664279" Apr 16 19:55:12.981277 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:12.981251 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:55:12.983476 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:12.983450 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4789667-3ad6-413b-9c9e-a072e7b79d5d-metrics-certs\") pod \"network-metrics-daemon-x5ml5\" (UID: \"f4789667-3ad6-413b-9c9e-a072e7b79d5d\") " pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:55:13.037813 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:13.037779 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nnp77\"" Apr 16 19:55:13.046621 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:13.046597 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x5ml5" Apr 16 19:55:13.160343 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:13.160283 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x5ml5"] Apr 16 19:55:13.163201 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:55:13.163173 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4789667_3ad6_413b_9c9e_a072e7b79d5d.slice/crio-6d7f4140776d5d0d054bcb2255594e8f93c7e3ea668eeb14daab5985d5064dd3 WatchSource:0}: Error finding container 6d7f4140776d5d0d054bcb2255594e8f93c7e3ea668eeb14daab5985d5064dd3: Status 404 returned error can't find the container with id 6d7f4140776d5d0d054bcb2255594e8f93c7e3ea668eeb14daab5985d5064dd3 Apr 16 19:55:13.600663 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:13.600620 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" event={"ID":"2375fe89-b353-4e12-9bfd-8baae7811d1d","Type":"ContainerStarted","Data":"214c6fae14769528fae4b787f9a8f4f3402c5cc1683096a8b0a3d7cddaca2e27"} Apr 16 19:55:13.600663 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:13.600663 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" event={"ID":"2375fe89-b353-4e12-9bfd-8baae7811d1d","Type":"ContainerStarted","Data":"780bbd037c31549a9323e2fd835a0aada003c30b7f8394cf9c5fe09b171ae744"} Apr 16 19:55:13.601781 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:13.601745 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x5ml5" event={"ID":"f4789667-3ad6-413b-9c9e-a072e7b79d5d","Type":"ContainerStarted","Data":"6d7f4140776d5d0d054bcb2255594e8f93c7e3ea668eeb14daab5985d5064dd3"} Apr 16 19:55:13.618736 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:13.618693 
2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c79bdfd7-q77l9" podStartSLOduration=1.455155335 podStartE2EDuration="10.618682035s" podCreationTimestamp="2026-04-16 19:55:03 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.716670869 +0000 UTC m=+56.992347437" lastFinishedPulling="2026-04-16 19:55:12.88019758 +0000 UTC m=+66.155874137" observedRunningTime="2026-04-16 19:55:13.617981005 +0000 UTC m=+66.893657582" watchObservedRunningTime="2026-04-16 19:55:13.618682035 +0000 UTC m=+66.894358609" Apr 16 19:55:14.607110 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.607073 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x5ml5" event={"ID":"f4789667-3ad6-413b-9c9e-a072e7b79d5d","Type":"ContainerStarted","Data":"b030e1b2897952e2cebaea67443a90ed686f066c5776587892d05f0dd0b05cb9"} Apr 16 19:55:14.607471 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.607118 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x5ml5" event={"ID":"f4789667-3ad6-413b-9c9e-a072e7b79d5d","Type":"ContainerStarted","Data":"e6921689ce04fe3bb0eba45da84af9972965ef4120088862e70561868d8287cc"} Apr 16 19:55:14.640147 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.639976 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x5ml5" podStartSLOduration=66.76679579 podStartE2EDuration="1m7.639962383s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:55:13.164998401 +0000 UTC m=+66.440674957" lastFinishedPulling="2026-04-16 19:55:14.038164997 +0000 UTC m=+67.313841550" observedRunningTime="2026-04-16 19:55:14.638984341 +0000 UTC m=+67.914660916" watchObservedRunningTime="2026-04-16 19:55:14.639962383 +0000 UTC m=+67.915638958" Apr 16 19:55:14.661703 ip-10-0-136-138 kubenswrapper[2567]: I0416 
19:55:14.661670 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9ghpw"]
Apr 16 19:55:14.663636 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.663619 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9ghpw"
Apr 16 19:55:14.665736 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.665715 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 19:55:14.665868 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.665718 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 19:55:14.666431 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.666415 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 19:55:14.666643 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.666630 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 19:55:14.666707 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.666659 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8qnsr\""
Apr 16 19:55:14.666767 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.666726 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 19:55:14.667115 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.667100 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 19:55:14.793094 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.793068 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxs5h\" (UniqueName: \"kubernetes.io/projected/5e98c2cc-e4f6-4adc-a067-f32e188531f9-kube-api-access-fxs5h\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw"
Apr 16 19:55:14.793204 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.793103 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-accelerators-collector-config\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw"
Apr 16 19:55:14.793204 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.793144 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5e98c2cc-e4f6-4adc-a067-f32e188531f9-root\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw"
Apr 16 19:55:14.793306 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.793195 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e98c2cc-e4f6-4adc-a067-f32e188531f9-metrics-client-ca\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw"
Apr 16 19:55:14.793404 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.793388 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-wtmp\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw"
Apr 16 19:55:14.793457 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.793424 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-textfile\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw"
Apr 16 19:55:14.793510 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.793454 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e98c2cc-e4f6-4adc-a067-f32e188531f9-sys\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw"
Apr 16 19:55:14.793564 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.793541 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-tls\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw"
Apr 16 19:55:14.793670 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.793642 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw"
Apr 16 19:55:14.894952 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.894895 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\"
(UniqueName: \"kubernetes.io/host-path/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-wtmp\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.894952 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.894934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-textfile\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.894952 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.894952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e98c2cc-e4f6-4adc-a067-f32e188531f9-sys\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895109 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.894973 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-tls\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895109 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895008 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895109 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895025 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e98c2cc-e4f6-4adc-a067-f32e188531f9-sys\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895109 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895039 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxs5h\" (UniqueName: \"kubernetes.io/projected/5e98c2cc-e4f6-4adc-a067-f32e188531f9-kube-api-access-fxs5h\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895109 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895058 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-wtmp\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895109 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895099 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-accelerators-collector-config\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895307 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895134 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5e98c2cc-e4f6-4adc-a067-f32e188531f9-root\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895307 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895178 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5e98c2cc-e4f6-4adc-a067-f32e188531f9-root\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895307 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895206 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e98c2cc-e4f6-4adc-a067-f32e188531f9-metrics-client-ca\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895307 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895266 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-textfile\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895716 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-accelerators-collector-config\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.895804 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.895724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e98c2cc-e4f6-4adc-a067-f32e188531f9-metrics-client-ca\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.897366 ip-10-0-136-138 
kubenswrapper[2567]: I0416 19:55:14.897346 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.897498 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.897477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5e98c2cc-e4f6-4adc-a067-f32e188531f9-node-exporter-tls\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.908253 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.908233 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxs5h\" (UniqueName: \"kubernetes.io/projected/5e98c2cc-e4f6-4adc-a067-f32e188531f9-kube-api-access-fxs5h\") pod \"node-exporter-9ghpw\" (UID: \"5e98c2cc-e4f6-4adc-a067-f32e188531f9\") " pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.972479 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:14.972453 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9ghpw" Apr 16 19:55:14.980822 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:55:14.980799 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e98c2cc_e4f6_4adc_a067_f32e188531f9.slice/crio-32d2c8150821640d477e77f1f1e2db1ac8cdb96eb1023dc7767f5591aedfb8a5 WatchSource:0}: Error finding container 32d2c8150821640d477e77f1f1e2db1ac8cdb96eb1023dc7767f5591aedfb8a5: Status 404 returned error can't find the container with id 32d2c8150821640d477e77f1f1e2db1ac8cdb96eb1023dc7767f5591aedfb8a5 Apr 16 19:55:15.611252 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:15.611219 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9ghpw" event={"ID":"5e98c2cc-e4f6-4adc-a067-f32e188531f9","Type":"ContainerStarted","Data":"32d2c8150821640d477e77f1f1e2db1ac8cdb96eb1023dc7767f5591aedfb8a5"} Apr 16 19:55:16.615261 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:16.615222 2567 generic.go:358] "Generic (PLEG): container finished" podID="5e98c2cc-e4f6-4adc-a067-f32e188531f9" containerID="85ea1dbc6b11a6d23a260b0997d02cdb3f725c87669917599ac24803ac19f295" exitCode=0 Apr 16 19:55:16.615261 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:16.615264 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9ghpw" event={"ID":"5e98c2cc-e4f6-4adc-a067-f32e188531f9","Type":"ContainerDied","Data":"85ea1dbc6b11a6d23a260b0997d02cdb3f725c87669917599ac24803ac19f295"} Apr 16 19:55:17.504710 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:17.504679 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-f7rhh" Apr 16 19:55:17.620114 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:17.620084 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9ghpw" 
event={"ID":"5e98c2cc-e4f6-4adc-a067-f32e188531f9","Type":"ContainerStarted","Data":"7e9f2242350d60d21ba97bcfa7b9c44f3365bfd5ac4b9581ea672b8f3ef1ef0e"} Apr 16 19:55:17.620114 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:17.620116 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9ghpw" event={"ID":"5e98c2cc-e4f6-4adc-a067-f32e188531f9","Type":"ContainerStarted","Data":"1b2a5dd14f4373b30d191cbc71e528a2313782dc0f21a9e3638a7a5751b73dfa"} Apr 16 19:55:17.641970 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:17.641798 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9ghpw" podStartSLOduration=2.987173224 podStartE2EDuration="3.641781791s" podCreationTimestamp="2026-04-16 19:55:14 +0000 UTC" firstStartedPulling="2026-04-16 19:55:14.982531517 +0000 UTC m=+68.258208070" lastFinishedPulling="2026-04-16 19:55:15.637140074 +0000 UTC m=+68.912816637" observedRunningTime="2026-04-16 19:55:17.640779545 +0000 UTC m=+70.916456119" watchObservedRunningTime="2026-04-16 19:55:17.641781791 +0000 UTC m=+70.917458375" Apr 16 19:55:19.023624 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.023586 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5569c8f6b5-h52pg"] Apr 16 19:55:19.026726 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.026704 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.031882 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.031859 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 19:55:19.032056 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.032023 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 19:55:19.032384 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.032367 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 19:55:19.032477 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.032412 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-plxg6\"" Apr 16 19:55:19.032477 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.032428 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-bdi42hbr68tu2\"" Apr 16 19:55:19.032477 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.032443 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 19:55:19.037645 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.037625 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5569c8f6b5-h52pg"] Apr 16 19:55:19.121045 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.121015 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-client-ca-bundle\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " 
pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.121045 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.121047 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-audit-log\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.121176 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.121070 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-secret-metrics-server-tls\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.121176 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.121096 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-secret-metrics-server-client-certs\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.121239 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.121175 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-metrics-server-audit-profiles\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.121239 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.121201 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.121239 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.121231 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc4l7\" (UniqueName: \"kubernetes.io/projected/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-kube-api-access-cc4l7\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.221554 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.221531 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-secret-metrics-server-tls\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.221662 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.221558 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-secret-metrics-server-client-certs\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.221662 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.221595 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-metrics-server-audit-profiles\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.221662 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.221616 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.221662 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.221650 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc4l7\" (UniqueName: \"kubernetes.io/projected/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-kube-api-access-cc4l7\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.222230 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.222202 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-client-ca-bundle\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.222358 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.222250 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-audit-log\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 
19:55:19.222822 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.222796 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-audit-log\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.223114 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.223065 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.223279 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.223260 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-metrics-server-audit-profiles\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.224201 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.224182 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-secret-metrics-server-client-certs\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.224428 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.224406 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-secret-metrics-server-tls\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.224589 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.224573 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-client-ca-bundle\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.231051 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.231020 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc4l7\" (UniqueName: \"kubernetes.io/projected/2deeec4c-a2b4-4d70-9324-1b64013cf1c6-kube-api-access-cc4l7\") pod \"metrics-server-5569c8f6b5-h52pg\" (UID: \"2deeec4c-a2b4-4d70-9324-1b64013cf1c6\") " pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.335257 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.335183 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:19.454333 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.454298 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5569c8f6b5-h52pg"] Apr 16 19:55:19.457230 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:55:19.457202 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2deeec4c_a2b4_4d70_9324_1b64013cf1c6.slice/crio-d0d697d6c5b3d4d8433a36dfc34e49d1e7b4665f76d30c6f73bba451b10e2406 WatchSource:0}: Error finding container d0d697d6c5b3d4d8433a36dfc34e49d1e7b4665f76d30c6f73bba451b10e2406: Status 404 returned error can't find the container with id d0d697d6c5b3d4d8433a36dfc34e49d1e7b4665f76d30c6f73bba451b10e2406 Apr 16 19:55:19.627226 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:19.627142 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" event={"ID":"2deeec4c-a2b4-4d70-9324-1b64013cf1c6","Type":"ContainerStarted","Data":"d0d697d6c5b3d4d8433a36dfc34e49d1e7b4665f76d30c6f73bba451b10e2406"} Apr 16 19:55:21.594658 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:21.594627 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9h7rg" Apr 16 19:55:21.635859 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:21.634900 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" event={"ID":"2deeec4c-a2b4-4d70-9324-1b64013cf1c6","Type":"ContainerStarted","Data":"78f9087bb21e782324e514713c9fd1936f8e368480be4ab4c3a472c5cf8f48fc"} Apr 16 19:55:21.654588 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:21.654543 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" podStartSLOduration=1.451841548 
podStartE2EDuration="2.654528043s" podCreationTimestamp="2026-04-16 19:55:19 +0000 UTC" firstStartedPulling="2026-04-16 19:55:19.459192315 +0000 UTC m=+72.734868867" lastFinishedPulling="2026-04-16 19:55:20.661878809 +0000 UTC m=+73.937555362" observedRunningTime="2026-04-16 19:55:21.653694541 +0000 UTC m=+74.929371118" watchObservedRunningTime="2026-04-16 19:55:21.654528043 +0000 UTC m=+74.930204617" Apr 16 19:55:39.336035 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:39.336000 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:39.336035 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:39.336043 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:59.340483 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:59.340453 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:55:59.344203 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:55:59.344184 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5569c8f6b5-h52pg" Apr 16 19:58:48.432863 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.432819 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-q842n"] Apr 16 19:58:48.435106 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.435090 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.439551 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.439532 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:58:48.445345 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.445325 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q842n"] Apr 16 19:58:48.509370 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.509347 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28d3fadc-dade-46f9-9279-3080298ec06b-original-pull-secret\") pod \"global-pull-secret-syncer-q842n\" (UID: \"28d3fadc-dade-46f9-9279-3080298ec06b\") " pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.509491 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.509388 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28d3fadc-dade-46f9-9279-3080298ec06b-dbus\") pod \"global-pull-secret-syncer-q842n\" (UID: \"28d3fadc-dade-46f9-9279-3080298ec06b\") " pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.509491 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.509409 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28d3fadc-dade-46f9-9279-3080298ec06b-kubelet-config\") pod \"global-pull-secret-syncer-q842n\" (UID: \"28d3fadc-dade-46f9-9279-3080298ec06b\") " pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.610329 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.610294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/28d3fadc-dade-46f9-9279-3080298ec06b-dbus\") pod \"global-pull-secret-syncer-q842n\" (UID: \"28d3fadc-dade-46f9-9279-3080298ec06b\") " pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.610470 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.610340 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28d3fadc-dade-46f9-9279-3080298ec06b-kubelet-config\") pod \"global-pull-secret-syncer-q842n\" (UID: \"28d3fadc-dade-46f9-9279-3080298ec06b\") " pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.610470 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.610399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28d3fadc-dade-46f9-9279-3080298ec06b-original-pull-secret\") pod \"global-pull-secret-syncer-q842n\" (UID: \"28d3fadc-dade-46f9-9279-3080298ec06b\") " pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.610583 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.610477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28d3fadc-dade-46f9-9279-3080298ec06b-dbus\") pod \"global-pull-secret-syncer-q842n\" (UID: \"28d3fadc-dade-46f9-9279-3080298ec06b\") " pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.610583 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.610492 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28d3fadc-dade-46f9-9279-3080298ec06b-kubelet-config\") pod \"global-pull-secret-syncer-q842n\" (UID: \"28d3fadc-dade-46f9-9279-3080298ec06b\") " pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.612854 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.612820 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28d3fadc-dade-46f9-9279-3080298ec06b-original-pull-secret\") pod \"global-pull-secret-syncer-q842n\" (UID: \"28d3fadc-dade-46f9-9279-3080298ec06b\") " pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.744448 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.744348 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q842n" Apr 16 19:58:48.885797 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:48.885764 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q842n"] Apr 16 19:58:48.890491 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:58:48.890457 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d3fadc_dade_46f9_9279_3080298ec06b.slice/crio-4d670969a70e87321a620dcf2aa6410d406e087f8723866863a96f01263de728 WatchSource:0}: Error finding container 4d670969a70e87321a620dcf2aa6410d406e087f8723866863a96f01263de728: Status 404 returned error can't find the container with id 4d670969a70e87321a620dcf2aa6410d406e087f8723866863a96f01263de728 Apr 16 19:58:49.140026 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:49.139932 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q842n" event={"ID":"28d3fadc-dade-46f9-9279-3080298ec06b","Type":"ContainerStarted","Data":"4d670969a70e87321a620dcf2aa6410d406e087f8723866863a96f01263de728"} Apr 16 19:58:53.150740 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:53.150704 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q842n" event={"ID":"28d3fadc-dade-46f9-9279-3080298ec06b","Type":"ContainerStarted","Data":"daad6d167190936b4992f5f48ba9918648c37309491b5a4950f0ffc4e65bf082"} Apr 16 19:58:53.167481 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:58:53.167441 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-q842n" podStartSLOduration=1.5997814030000002 podStartE2EDuration="5.167425879s" podCreationTimestamp="2026-04-16 19:58:48 +0000 UTC" firstStartedPulling="2026-04-16 19:58:48.892273265 +0000 UTC m=+282.167949819" lastFinishedPulling="2026-04-16 19:58:52.459917743 +0000 UTC m=+285.735594295" observedRunningTime="2026-04-16 19:58:53.165953696 +0000 UTC m=+286.441630270" watchObservedRunningTime="2026-04-16 19:58:53.167425879 +0000 UTC m=+286.443102454"
Apr 16 19:59:07.215672 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:07.215636 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log"
Apr 16 19:59:07.216112 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:07.215817 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log"
Apr 16 19:59:07.218196 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:07.218178 2567 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 19:59:33.221415 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.221381 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-772bs"]
Apr 16 19:59:33.223505 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.223490 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:33.225906 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.225883 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 19:59:33.226029 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.225919 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 19:59:33.226029 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.225889 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 19:59:33.226029 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.225942 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 19:59:33.226359 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.226341 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 19:59:33.226437 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.226340 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5b9qt\""
Apr 16 19:59:33.236317 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.236289 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-772bs"]
Apr 16 19:59:33.310417 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.310389 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2p8m\" (UniqueName: \"kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-kube-api-access-k2p8m\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:33.310528 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.310444 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/51461c0e-37ec-4c78-8fa7-2a135d82854d-cabundle0\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:33.310528 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.310463 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:33.411710 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.411678 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/51461c0e-37ec-4c78-8fa7-2a135d82854d-cabundle0\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:33.411710 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.411710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:33.411920 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.411738 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2p8m\" (UniqueName: \"kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-kube-api-access-k2p8m\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:33.411920 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:33.411864 2567 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 16 19:59:33.411920 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:33.411882 2567 secret.go:281] references non-existent secret key: ca.crt
Apr 16 19:59:33.411920 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:33.411891 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 19:59:33.411920 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:33.411905 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-772bs: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 19:59:33.412083 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:33.411984 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates podName:51461c0e-37ec-4c78-8fa7-2a135d82854d nodeName:}" failed. No retries permitted until 2026-04-16 19:59:33.911964695 +0000 UTC m=+327.187641263 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates") pod "keda-operator-ffbb595cb-772bs" (UID: "51461c0e-37ec-4c78-8fa7-2a135d82854d") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 19:59:33.412281 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.412263 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/51461c0e-37ec-4c78-8fa7-2a135d82854d-cabundle0\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:33.436531 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.436500 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2p8m\" (UniqueName: \"kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-kube-api-access-k2p8m\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:33.915078 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:33.914963 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:33.915320 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:33.915143 2567 secret.go:281] references non-existent secret key: ca.crt
Apr 16 19:59:33.915320 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:33.915164 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 19:59:33.915320 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:33.915174 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-772bs: references non-existent secret key: ca.crt
Apr 16 19:59:33.915320 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:33.915238 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates podName:51461c0e-37ec-4c78-8fa7-2a135d82854d nodeName:}" failed. No retries permitted until 2026-04-16 19:59:34.915219884 +0000 UTC m=+328.190896440 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates") pod "keda-operator-ffbb595cb-772bs" (UID: "51461c0e-37ec-4c78-8fa7-2a135d82854d") : references non-existent secret key: ca.crt
Apr 16 19:59:34.923219 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:34.923186 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:34.923596 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:34.923285 2567 secret.go:281] references non-existent secret key: ca.crt
Apr 16 19:59:34.923596 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:34.923304 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 19:59:34.923596 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:34.923313 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-772bs: references non-existent secret key: ca.crt
Apr 16 19:59:34.923596 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:34.923362 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates podName:51461c0e-37ec-4c78-8fa7-2a135d82854d nodeName:}" failed. No retries permitted until 2026-04-16 19:59:36.923347778 +0000 UTC m=+330.199024335 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates") pod "keda-operator-ffbb595cb-772bs" (UID: "51461c0e-37ec-4c78-8fa7-2a135d82854d") : references non-existent secret key: ca.crt
Apr 16 19:59:36.937801 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:36.937756 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:36.938191 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:36.937923 2567 secret.go:281] references non-existent secret key: ca.crt
Apr 16 19:59:36.938191 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:36.937942 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 19:59:36.938191 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:36.937952 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-772bs: references non-existent secret key: ca.crt
Apr 16 19:59:36.938191 ip-10-0-136-138 kubenswrapper[2567]: E0416 19:59:36.938008 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates podName:51461c0e-37ec-4c78-8fa7-2a135d82854d nodeName:}" failed. No retries permitted until 2026-04-16 19:59:40.937993336 +0000 UTC m=+334.213669888 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates") pod "keda-operator-ffbb595cb-772bs" (UID: "51461c0e-37ec-4c78-8fa7-2a135d82854d") : references non-existent secret key: ca.crt
Apr 16 19:59:40.964183 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:40.964148 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:40.966635 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:40.966614 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/51461c0e-37ec-4c78-8fa7-2a135d82854d-certificates\") pod \"keda-operator-ffbb595cb-772bs\" (UID: \"51461c0e-37ec-4c78-8fa7-2a135d82854d\") " pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:41.033492 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:41.033459 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:41.153127 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:41.153101 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-772bs"]
Apr 16 19:59:41.155915 ip-10-0-136-138 kubenswrapper[2567]: W0416 19:59:41.155886 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51461c0e_37ec_4c78_8fa7_2a135d82854d.slice/crio-167a531d21409e10c3ba5ede847698d85fda050a4a02bc4de653c77863c71202 WatchSource:0}: Error finding container 167a531d21409e10c3ba5ede847698d85fda050a4a02bc4de653c77863c71202: Status 404 returned error can't find the container with id 167a531d21409e10c3ba5ede847698d85fda050a4a02bc4de653c77863c71202
Apr 16 19:59:41.157542 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:41.157524 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:59:41.276383 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:41.276309 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-772bs" event={"ID":"51461c0e-37ec-4c78-8fa7-2a135d82854d","Type":"ContainerStarted","Data":"167a531d21409e10c3ba5ede847698d85fda050a4a02bc4de653c77863c71202"}
Apr 16 19:59:45.289116 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:45.289080 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-772bs" event={"ID":"51461c0e-37ec-4c78-8fa7-2a135d82854d","Type":"ContainerStarted","Data":"0173cd2329e438f6914f2bc098643555e2be9333aa0245f4d1b9d58b616f177a"}
Apr 16 19:59:45.289501 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:45.289219 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 19:59:45.310128 ip-10-0-136-138 kubenswrapper[2567]: I0416 19:59:45.310075 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-772bs" podStartSLOduration=8.961006933 podStartE2EDuration="12.310058932s" podCreationTimestamp="2026-04-16 19:59:33 +0000 UTC" firstStartedPulling="2026-04-16 19:59:41.157702552 +0000 UTC m=+334.433379108" lastFinishedPulling="2026-04-16 19:59:44.506754551 +0000 UTC m=+337.782431107" observedRunningTime="2026-04-16 19:59:45.309248448 +0000 UTC m=+338.584925039" watchObservedRunningTime="2026-04-16 19:59:45.310058932 +0000 UTC m=+338.585735507"
Apr 16 20:00:06.293672 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:06.293641 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-772bs"
Apr 16 20:00:42.034856 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.034810 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-bntkp"]
Apr 16 20:00:42.043293 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.043268 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:00:42.049757 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.049471 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 20:00:42.049757 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.049740 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 20:00:42.049939 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.049805 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 20:00:42.050070 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.050053 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-vfqqm\""
Apr 16 20:00:42.056869 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.056824 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-bntkp"]
Apr 16 20:00:42.143046 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.143016 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-69sc9"]
Apr 16 20:00:42.146002 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.145986 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-69sc9"
Apr 16 20:00:42.148242 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.148222 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 20:00:42.148687 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.148673 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-sshjd\""
Apr 16 20:00:42.163813 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.163793 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-69sc9"]
Apr 16 20:00:42.190402 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.190377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-cert\") pod \"kserve-controller-manager-659c8cbdc-bntkp\" (UID: \"72f87cf8-bb37-4331-b6d9-3ea7cd97a390\") " pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:00:42.190491 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.190426 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6c5f\" (UniqueName: \"kubernetes.io/projected/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-kube-api-access-h6c5f\") pod \"kserve-controller-manager-659c8cbdc-bntkp\" (UID: \"72f87cf8-bb37-4331-b6d9-3ea7cd97a390\") " pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:00:42.291469 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.291413 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-cert\") pod \"kserve-controller-manager-659c8cbdc-bntkp\" (UID: \"72f87cf8-bb37-4331-b6d9-3ea7cd97a390\") " pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:00:42.291469 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.291454 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cd6c3db7-29db-4303-8dc1-05f7d32ba7e3-data\") pod \"seaweedfs-86cc847c5c-69sc9\" (UID: \"cd6c3db7-29db-4303-8dc1-05f7d32ba7e3\") " pod="kserve/seaweedfs-86cc847c5c-69sc9"
Apr 16 20:00:42.291635 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.291482 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x9q8\" (UniqueName: \"kubernetes.io/projected/cd6c3db7-29db-4303-8dc1-05f7d32ba7e3-kube-api-access-7x9q8\") pod \"seaweedfs-86cc847c5c-69sc9\" (UID: \"cd6c3db7-29db-4303-8dc1-05f7d32ba7e3\") " pod="kserve/seaweedfs-86cc847c5c-69sc9"
Apr 16 20:00:42.291635 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.291511 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6c5f\" (UniqueName: \"kubernetes.io/projected/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-kube-api-access-h6c5f\") pod \"kserve-controller-manager-659c8cbdc-bntkp\" (UID: \"72f87cf8-bb37-4331-b6d9-3ea7cd97a390\") " pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:00:42.293896 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.293879 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-cert\") pod \"kserve-controller-manager-659c8cbdc-bntkp\" (UID: \"72f87cf8-bb37-4331-b6d9-3ea7cd97a390\") " pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:00:42.310811 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.310782 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6c5f\" (UniqueName: \"kubernetes.io/projected/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-kube-api-access-h6c5f\") pod \"kserve-controller-manager-659c8cbdc-bntkp\" (UID: \"72f87cf8-bb37-4331-b6d9-3ea7cd97a390\") " pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:00:42.353209 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.353186 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:00:42.391986 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.391952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cd6c3db7-29db-4303-8dc1-05f7d32ba7e3-data\") pod \"seaweedfs-86cc847c5c-69sc9\" (UID: \"cd6c3db7-29db-4303-8dc1-05f7d32ba7e3\") " pod="kserve/seaweedfs-86cc847c5c-69sc9"
Apr 16 20:00:42.392131 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.392024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x9q8\" (UniqueName: \"kubernetes.io/projected/cd6c3db7-29db-4303-8dc1-05f7d32ba7e3-kube-api-access-7x9q8\") pod \"seaweedfs-86cc847c5c-69sc9\" (UID: \"cd6c3db7-29db-4303-8dc1-05f7d32ba7e3\") " pod="kserve/seaweedfs-86cc847c5c-69sc9"
Apr 16 20:00:42.392333 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.392310 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cd6c3db7-29db-4303-8dc1-05f7d32ba7e3-data\") pod \"seaweedfs-86cc847c5c-69sc9\" (UID: \"cd6c3db7-29db-4303-8dc1-05f7d32ba7e3\") " pod="kserve/seaweedfs-86cc847c5c-69sc9"
Apr 16 20:00:42.404184 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.404159 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x9q8\" (UniqueName: \"kubernetes.io/projected/cd6c3db7-29db-4303-8dc1-05f7d32ba7e3-kube-api-access-7x9q8\") pod \"seaweedfs-86cc847c5c-69sc9\" (UID: \"cd6c3db7-29db-4303-8dc1-05f7d32ba7e3\") " pod="kserve/seaweedfs-86cc847c5c-69sc9"
Apr 16 20:00:42.454137 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.454109 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-69sc9"
Apr 16 20:00:42.482529 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.482501 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-bntkp"]
Apr 16 20:00:42.485652 ip-10-0-136-138 kubenswrapper[2567]: W0416 20:00:42.485625 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72f87cf8_bb37_4331_b6d9_3ea7cd97a390.slice/crio-c1d6308178e5e54cb9ec994cce83c8e0c228c173fe2589d1b0d4185625378d0f WatchSource:0}: Error finding container c1d6308178e5e54cb9ec994cce83c8e0c228c173fe2589d1b0d4185625378d0f: Status 404 returned error can't find the container with id c1d6308178e5e54cb9ec994cce83c8e0c228c173fe2589d1b0d4185625378d0f
Apr 16 20:00:42.574912 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:42.574891 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-69sc9"]
Apr 16 20:00:42.576902 ip-10-0-136-138 kubenswrapper[2567]: W0416 20:00:42.576880 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd6c3db7_29db_4303_8dc1_05f7d32ba7e3.slice/crio-a45842ff50b8422b4f79608263719efbc719fc6d28eb6c18fb802209cdc3c987 WatchSource:0}: Error finding container a45842ff50b8422b4f79608263719efbc719fc6d28eb6c18fb802209cdc3c987: Status 404 returned error can't find the container with id a45842ff50b8422b4f79608263719efbc719fc6d28eb6c18fb802209cdc3c987
Apr 16 20:00:43.431174 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:43.431111 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp" event={"ID":"72f87cf8-bb37-4331-b6d9-3ea7cd97a390","Type":"ContainerStarted","Data":"c1d6308178e5e54cb9ec994cce83c8e0c228c173fe2589d1b0d4185625378d0f"}
Apr 16 20:00:43.433074 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:43.433045 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-69sc9" event={"ID":"cd6c3db7-29db-4303-8dc1-05f7d32ba7e3","Type":"ContainerStarted","Data":"a45842ff50b8422b4f79608263719efbc719fc6d28eb6c18fb802209cdc3c987"}
Apr 16 20:00:46.443316 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:46.443283 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-69sc9" event={"ID":"cd6c3db7-29db-4303-8dc1-05f7d32ba7e3","Type":"ContainerStarted","Data":"715545cd2fada110d211594abbc43914cc58f780307aae1c69f4474cabff4b41"}
Apr 16 20:00:46.443758 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:46.443393 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-69sc9"
Apr 16 20:00:46.444565 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:46.444543 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp" event={"ID":"72f87cf8-bb37-4331-b6d9-3ea7cd97a390","Type":"ContainerStarted","Data":"201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca"}
Apr 16 20:00:46.444686 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:46.444671 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:00:46.473614 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:46.473567 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-69sc9" podStartSLOduration=0.952819421 podStartE2EDuration="4.473554986s" podCreationTimestamp="2026-04-16 20:00:42 +0000 UTC" firstStartedPulling="2026-04-16 20:00:42.578198688 +0000 UTC m=+395.853875241" lastFinishedPulling="2026-04-16 20:00:46.098934247 +0000 UTC m=+399.374610806" observedRunningTime="2026-04-16 20:00:46.472035163 +0000 UTC m=+399.747711737" watchObservedRunningTime="2026-04-16 20:00:46.473554986 +0000 UTC m=+399.749231560"
Apr 16 20:00:52.449368 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:52.449333 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-69sc9"
Apr 16 20:00:52.487934 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:00:52.487886 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp" podStartSLOduration=7.9699513379999996 podStartE2EDuration="11.487872723s" podCreationTimestamp="2026-04-16 20:00:41 +0000 UTC" firstStartedPulling="2026-04-16 20:00:42.487475592 +0000 UTC m=+395.763152146" lastFinishedPulling="2026-04-16 20:00:46.005396966 +0000 UTC m=+399.281073531" observedRunningTime="2026-04-16 20:00:46.504514838 +0000 UTC m=+399.780191412" watchObservedRunningTime="2026-04-16 20:00:52.487872723 +0000 UTC m=+405.763549297"
Apr 16 20:01:17.452194 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:17.452122 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:01:21.032318 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.032285 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-bntkp"]
Apr 16 20:01:21.032676 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.032502 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp" podUID="72f87cf8-bb37-4331-b6d9-3ea7cd97a390" containerName="manager" containerID="cri-o://201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca" gracePeriod=10
Apr 16 20:01:21.057613 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.057587 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-shq9p"]
Apr 16 20:01:21.060311 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.060295 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-shq9p"
Apr 16 20:01:21.071521 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.071493 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-shq9p"]
Apr 16 20:01:21.162216 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.162191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7fpj\" (UniqueName: \"kubernetes.io/projected/3046784b-2eb5-46e2-b95e-56d76d5679a3-kube-api-access-h7fpj\") pod \"kserve-controller-manager-659c8cbdc-shq9p\" (UID: \"3046784b-2eb5-46e2-b95e-56d76d5679a3\") " pod="kserve/kserve-controller-manager-659c8cbdc-shq9p"
Apr 16 20:01:21.162331 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.162271 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3046784b-2eb5-46e2-b95e-56d76d5679a3-cert\") pod \"kserve-controller-manager-659c8cbdc-shq9p\" (UID: \"3046784b-2eb5-46e2-b95e-56d76d5679a3\") " pod="kserve/kserve-controller-manager-659c8cbdc-shq9p"
Apr 16 20:01:21.263348 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.263317 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3046784b-2eb5-46e2-b95e-56d76d5679a3-cert\") pod \"kserve-controller-manager-659c8cbdc-shq9p\" (UID: \"3046784b-2eb5-46e2-b95e-56d76d5679a3\") " pod="kserve/kserve-controller-manager-659c8cbdc-shq9p"
Apr 16 20:01:21.263452 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.263361 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7fpj\" (UniqueName: \"kubernetes.io/projected/3046784b-2eb5-46e2-b95e-56d76d5679a3-kube-api-access-h7fpj\") pod \"kserve-controller-manager-659c8cbdc-shq9p\" (UID: \"3046784b-2eb5-46e2-b95e-56d76d5679a3\") " pod="kserve/kserve-controller-manager-659c8cbdc-shq9p"
Apr 16 20:01:21.265928 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.265904 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3046784b-2eb5-46e2-b95e-56d76d5679a3-cert\") pod \"kserve-controller-manager-659c8cbdc-shq9p\" (UID: \"3046784b-2eb5-46e2-b95e-56d76d5679a3\") " pod="kserve/kserve-controller-manager-659c8cbdc-shq9p"
Apr 16 20:01:21.271546 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.271523 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7fpj\" (UniqueName: \"kubernetes.io/projected/3046784b-2eb5-46e2-b95e-56d76d5679a3-kube-api-access-h7fpj\") pod \"kserve-controller-manager-659c8cbdc-shq9p\" (UID: \"3046784b-2eb5-46e2-b95e-56d76d5679a3\") " pod="kserve/kserve-controller-manager-659c8cbdc-shq9p"
Apr 16 20:01:21.272950 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.272932 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:01:21.364558 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.364463 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-cert\") pod \"72f87cf8-bb37-4331-b6d9-3ea7cd97a390\" (UID: \"72f87cf8-bb37-4331-b6d9-3ea7cd97a390\") "
Apr 16 20:01:21.364558 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.364522 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6c5f\" (UniqueName: \"kubernetes.io/projected/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-kube-api-access-h6c5f\") pod \"72f87cf8-bb37-4331-b6d9-3ea7cd97a390\" (UID: \"72f87cf8-bb37-4331-b6d9-3ea7cd97a390\") "
Apr 16 20:01:21.366768 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.366742 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-kube-api-access-h6c5f" (OuterVolumeSpecName: "kube-api-access-h6c5f") pod "72f87cf8-bb37-4331-b6d9-3ea7cd97a390" (UID: "72f87cf8-bb37-4331-b6d9-3ea7cd97a390"). InnerVolumeSpecName "kube-api-access-h6c5f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:01:21.366866 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.366777 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-cert" (OuterVolumeSpecName: "cert") pod "72f87cf8-bb37-4331-b6d9-3ea7cd97a390" (UID: "72f87cf8-bb37-4331-b6d9-3ea7cd97a390"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:01:21.403961 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.403931 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-shq9p"
Apr 16 20:01:21.465873 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.465818 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h6c5f\" (UniqueName: \"kubernetes.io/projected/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-kube-api-access-h6c5f\") on node \"ip-10-0-136-138.ec2.internal\" DevicePath \"\""
Apr 16 20:01:21.465873 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.465874 2567 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72f87cf8-bb37-4331-b6d9-3ea7cd97a390-cert\") on node \"ip-10-0-136-138.ec2.internal\" DevicePath \"\""
Apr 16 20:01:21.520244 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.520216 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-shq9p"]
Apr 16 20:01:21.523724 ip-10-0-136-138 kubenswrapper[2567]: W0416 20:01:21.523696 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3046784b_2eb5_46e2_b95e_56d76d5679a3.slice/crio-c36bb6b66363289bbc9f6584b3b7abd500ac9db0ee255ea30284a33e7c095299 WatchSource:0}: Error finding container c36bb6b66363289bbc9f6584b3b7abd500ac9db0ee255ea30284a33e7c095299: Status 404 returned error can't find the container with id c36bb6b66363289bbc9f6584b3b7abd500ac9db0ee255ea30284a33e7c095299
Apr 16 20:01:21.531043 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.531011 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-shq9p" event={"ID":"3046784b-2eb5-46e2-b95e-56d76d5679a3","Type":"ContainerStarted","Data":"c36bb6b66363289bbc9f6584b3b7abd500ac9db0ee255ea30284a33e7c095299"}
Apr 16 20:01:21.532462 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.532440 2567 generic.go:358] "Generic (PLEG): container finished" podID="72f87cf8-bb37-4331-b6d9-3ea7cd97a390" containerID="201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca" exitCode=0
Apr 16 20:01:21.532575 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.532493 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp" event={"ID":"72f87cf8-bb37-4331-b6d9-3ea7cd97a390","Type":"ContainerDied","Data":"201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca"}
Apr 16 20:01:21.532575 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.532514 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp"
Apr 16 20:01:21.532575 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.532524 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-bntkp" event={"ID":"72f87cf8-bb37-4331-b6d9-3ea7cd97a390","Type":"ContainerDied","Data":"c1d6308178e5e54cb9ec994cce83c8e0c228c173fe2589d1b0d4185625378d0f"}
Apr 16 20:01:21.532575 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.532543 2567 scope.go:117] "RemoveContainer" containerID="201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca"
Apr 16 20:01:21.539992 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.539975 2567 scope.go:117] "RemoveContainer" containerID="201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca"
Apr 16 20:01:21.540290 ip-10-0-136-138 kubenswrapper[2567]: E0416 20:01:21.540265 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca\": container with ID starting with 201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca not found: ID does not exist" containerID="201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca"
Apr 16 20:01:21.540370 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.540302 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca"} err="failed to get container status \"201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca\": rpc error: code = NotFound desc = could not find container \"201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca\": container with ID starting with 201ad7c58e59af899b99d83aae0fe1da33765f5e5605f3768af78ed6842dadca not found: ID does not exist"
Apr 16 20:01:21.551689 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.551666 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-bntkp"]
Apr 16 20:01:21.554582 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:21.554561 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-bntkp"]
Apr 16 20:01:22.536653 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:22.536618 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-shq9p" event={"ID":"3046784b-2eb5-46e2-b95e-56d76d5679a3","Type":"ContainerStarted","Data":"750e833185b1660039255601ad931547b61908b72e57f18caeb05ef87b40b3b4"}
Apr 16 20:01:22.537123 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:22.536678 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-shq9p"
Apr 16 20:01:23.326389 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:23.326357 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72f87cf8-bb37-4331-b6d9-3ea7cd97a390" path="/var/lib/kubelet/pods/72f87cf8-bb37-4331-b6d9-3ea7cd97a390/volumes"
Apr 16 20:01:53.545094 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:53.545058 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-shq9p"
Apr 16 20:01:53.561883 ip-10-0-136-138 kubenswrapper[2567]: I0416
20:01:53.561821 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-shq9p" podStartSLOduration=32.064291662 podStartE2EDuration="32.561807141s" podCreationTimestamp="2026-04-16 20:01:21 +0000 UTC" firstStartedPulling="2026-04-16 20:01:21.524924134 +0000 UTC m=+434.800600688" lastFinishedPulling="2026-04-16 20:01:22.022439611 +0000 UTC m=+435.298116167" observedRunningTime="2026-04-16 20:01:22.555475363 +0000 UTC m=+435.831151940" watchObservedRunningTime="2026-04-16 20:01:53.561807141 +0000 UTC m=+466.837483732" Apr 16 20:01:54.447696 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.447662 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-75v8g"] Apr 16 20:01:54.447979 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.447965 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72f87cf8-bb37-4331-b6d9-3ea7cd97a390" containerName="manager" Apr 16 20:01:54.448030 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.447982 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f87cf8-bb37-4331-b6d9-3ea7cd97a390" containerName="manager" Apr 16 20:01:54.448064 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.448046 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="72f87cf8-bb37-4331-b6d9-3ea7cd97a390" containerName="manager" Apr 16 20:01:54.450996 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.450977 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-75v8g" Apr 16 20:01:54.453321 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.453304 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 20:01:54.453424 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.453308 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-97s5m\"" Apr 16 20:01:54.465930 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.465907 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-75v8g"] Apr 16 20:01:54.469004 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.468984 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-7snmp"] Apr 16 20:01:54.471848 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.471818 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7snmp" Apr 16 20:01:54.474078 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.474060 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 20:01:54.474263 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.474250 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-kwqbl\"" Apr 16 20:01:54.481474 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.481455 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7snmp"] Apr 16 20:01:54.599550 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.599516 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-899gq\" (UniqueName: \"kubernetes.io/projected/2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f-kube-api-access-899gq\") pod 
\"odh-model-controller-696fc77849-7snmp\" (UID: \"2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f\") " pod="kserve/odh-model-controller-696fc77849-7snmp" Apr 16 20:01:54.599931 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.599559 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a26cc242-5e43-4902-bc3f-dcb0918a4e4b-tls-certs\") pod \"model-serving-api-86f7b4b499-75v8g\" (UID: \"a26cc242-5e43-4902-bc3f-dcb0918a4e4b\") " pod="kserve/model-serving-api-86f7b4b499-75v8g" Apr 16 20:01:54.599931 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.599589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5rsb\" (UniqueName: \"kubernetes.io/projected/a26cc242-5e43-4902-bc3f-dcb0918a4e4b-kube-api-access-n5rsb\") pod \"model-serving-api-86f7b4b499-75v8g\" (UID: \"a26cc242-5e43-4902-bc3f-dcb0918a4e4b\") " pod="kserve/model-serving-api-86f7b4b499-75v8g" Apr 16 20:01:54.599931 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.599608 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f-cert\") pod \"odh-model-controller-696fc77849-7snmp\" (UID: \"2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f\") " pod="kserve/odh-model-controller-696fc77849-7snmp" Apr 16 20:01:54.700974 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.700892 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-899gq\" (UniqueName: \"kubernetes.io/projected/2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f-kube-api-access-899gq\") pod \"odh-model-controller-696fc77849-7snmp\" (UID: \"2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f\") " pod="kserve/odh-model-controller-696fc77849-7snmp" Apr 16 20:01:54.700974 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.700953 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a26cc242-5e43-4902-bc3f-dcb0918a4e4b-tls-certs\") pod \"model-serving-api-86f7b4b499-75v8g\" (UID: \"a26cc242-5e43-4902-bc3f-dcb0918a4e4b\") " pod="kserve/model-serving-api-86f7b4b499-75v8g" Apr 16 20:01:54.701193 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.700997 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5rsb\" (UniqueName: \"kubernetes.io/projected/a26cc242-5e43-4902-bc3f-dcb0918a4e4b-kube-api-access-n5rsb\") pod \"model-serving-api-86f7b4b499-75v8g\" (UID: \"a26cc242-5e43-4902-bc3f-dcb0918a4e4b\") " pod="kserve/model-serving-api-86f7b4b499-75v8g" Apr 16 20:01:54.701193 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.701022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f-cert\") pod \"odh-model-controller-696fc77849-7snmp\" (UID: \"2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f\") " pod="kserve/odh-model-controller-696fc77849-7snmp" Apr 16 20:01:54.703590 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.703554 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f-cert\") pod \"odh-model-controller-696fc77849-7snmp\" (UID: \"2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f\") " pod="kserve/odh-model-controller-696fc77849-7snmp" Apr 16 20:01:54.703698 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.703559 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a26cc242-5e43-4902-bc3f-dcb0918a4e4b-tls-certs\") pod \"model-serving-api-86f7b4b499-75v8g\" (UID: \"a26cc242-5e43-4902-bc3f-dcb0918a4e4b\") " pod="kserve/model-serving-api-86f7b4b499-75v8g" Apr 16 20:01:54.714314 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.714283 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5rsb\" (UniqueName: \"kubernetes.io/projected/a26cc242-5e43-4902-bc3f-dcb0918a4e4b-kube-api-access-n5rsb\") pod \"model-serving-api-86f7b4b499-75v8g\" (UID: \"a26cc242-5e43-4902-bc3f-dcb0918a4e4b\") " pod="kserve/model-serving-api-86f7b4b499-75v8g" Apr 16 20:01:54.714419 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.714296 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-899gq\" (UniqueName: \"kubernetes.io/projected/2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f-kube-api-access-899gq\") pod \"odh-model-controller-696fc77849-7snmp\" (UID: \"2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f\") " pod="kserve/odh-model-controller-696fc77849-7snmp" Apr 16 20:01:54.760945 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.760922 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-75v8g" Apr 16 20:01:54.781585 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.781563 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7snmp" Apr 16 20:01:54.891916 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.891890 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-75v8g"] Apr 16 20:01:54.894442 ip-10-0-136-138 kubenswrapper[2567]: W0416 20:01:54.894406 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda26cc242_5e43_4902_bc3f_dcb0918a4e4b.slice/crio-5408afa48cbbcf83280774ee641f817732c5055deae3f830e97517f76f742d1e WatchSource:0}: Error finding container 5408afa48cbbcf83280774ee641f817732c5055deae3f830e97517f76f742d1e: Status 404 returned error can't find the container with id 5408afa48cbbcf83280774ee641f817732c5055deae3f830e97517f76f742d1e Apr 16 20:01:54.907645 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:54.907619 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7snmp"] Apr 16 20:01:54.910303 ip-10-0-136-138 kubenswrapper[2567]: W0416 20:01:54.910282 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d0f2bb7_9f63_42f8_8b60_ce5bc64bfe7f.slice/crio-e210724ff79637f211556bdaa892d9901ad1d4792342994e5133a45597310d6d WatchSource:0}: Error finding container e210724ff79637f211556bdaa892d9901ad1d4792342994e5133a45597310d6d: Status 404 returned error can't find the container with id e210724ff79637f211556bdaa892d9901ad1d4792342994e5133a45597310d6d Apr 16 20:01:55.624952 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:55.624911 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7snmp" event={"ID":"2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f","Type":"ContainerStarted","Data":"e210724ff79637f211556bdaa892d9901ad1d4792342994e5133a45597310d6d"} Apr 16 20:01:55.626756 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:55.626726 2567 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-75v8g" event={"ID":"a26cc242-5e43-4902-bc3f-dcb0918a4e4b","Type":"ContainerStarted","Data":"5408afa48cbbcf83280774ee641f817732c5055deae3f830e97517f76f742d1e"} Apr 16 20:01:58.638045 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:58.638009 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7snmp" event={"ID":"2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f","Type":"ContainerStarted","Data":"2b53415d144a5b278c7f277b1c660d341bbdb74b030dd5a8ef8ac63d6f1db671"} Apr 16 20:01:58.638535 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:58.638119 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-7snmp" Apr 16 20:01:58.639364 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:58.639341 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-75v8g" event={"ID":"a26cc242-5e43-4902-bc3f-dcb0918a4e4b","Type":"ContainerStarted","Data":"6a79c133f2e79ac12ab376df33808ae07283acdb7983744b42f1e737e9af4ae9"} Apr 16 20:01:58.639465 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:58.639447 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-75v8g" Apr 16 20:01:58.654016 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:58.653973 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-7snmp" podStartSLOduration=1.983803724 podStartE2EDuration="4.65395992s" podCreationTimestamp="2026-04-16 20:01:54 +0000 UTC" firstStartedPulling="2026-04-16 20:01:54.911389271 +0000 UTC m=+468.187065824" lastFinishedPulling="2026-04-16 20:01:57.581545467 +0000 UTC m=+470.857222020" observedRunningTime="2026-04-16 20:01:58.652580872 +0000 UTC m=+471.928257441" watchObservedRunningTime="2026-04-16 20:01:58.65395992 +0000 UTC m=+471.929636494" Apr 16 
20:01:58.669675 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:01:58.669633 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-75v8g" podStartSLOduration=2.032120472 podStartE2EDuration="4.669620711s" podCreationTimestamp="2026-04-16 20:01:54 +0000 UTC" firstStartedPulling="2026-04-16 20:01:54.896259858 +0000 UTC m=+468.171936411" lastFinishedPulling="2026-04-16 20:01:57.533760093 +0000 UTC m=+470.809436650" observedRunningTime="2026-04-16 20:01:58.668426206 +0000 UTC m=+471.944102781" watchObservedRunningTime="2026-04-16 20:01:58.669620711 +0000 UTC m=+471.945297286" Apr 16 20:02:09.644949 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:02:09.644918 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-7snmp" Apr 16 20:02:09.646797 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:02:09.646776 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-75v8g" Apr 16 20:04:07.233529 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:04:07.233502 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log" Apr 16 20:04:07.235000 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:04:07.234982 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log" Apr 16 20:07:26.004284 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.004210 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t"] Apr 16 20:07:26.006150 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.006134 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:26.008507 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.008481 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-78192-kube-rbac-proxy-sar-config\"" Apr 16 20:07:26.008634 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.008612 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-g6xwp\"" Apr 16 20:07:26.008706 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.008611 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:07:26.009084 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.009069 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-78192-serving-cert\"" Apr 16 20:07:26.016157 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.016134 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t"] Apr 16 20:07:26.083687 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.083659 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98c6154b-5b56-499a-a050-3300479f7193-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-78192-7f674fdfb9-dx64t\" (UID: \"98c6154b-5b56-499a-a050-3300479f7193\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:26.083820 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.083717 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98c6154b-5b56-499a-a050-3300479f7193-proxy-tls\") pod 
\"model-chainer-raw-hpa-78192-7f674fdfb9-dx64t\" (UID: \"98c6154b-5b56-499a-a050-3300479f7193\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:26.184506 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.184469 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98c6154b-5b56-499a-a050-3300479f7193-proxy-tls\") pod \"model-chainer-raw-hpa-78192-7f674fdfb9-dx64t\" (UID: \"98c6154b-5b56-499a-a050-3300479f7193\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:26.184663 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.184523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98c6154b-5b56-499a-a050-3300479f7193-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-78192-7f674fdfb9-dx64t\" (UID: \"98c6154b-5b56-499a-a050-3300479f7193\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:26.185204 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.185181 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98c6154b-5b56-499a-a050-3300479f7193-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-78192-7f674fdfb9-dx64t\" (UID: \"98c6154b-5b56-499a-a050-3300479f7193\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:26.187020 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.186997 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98c6154b-5b56-499a-a050-3300479f7193-proxy-tls\") pod \"model-chainer-raw-hpa-78192-7f674fdfb9-dx64t\" (UID: \"98c6154b-5b56-499a-a050-3300479f7193\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:26.316861 
ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.316760 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:26.435445 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.435415 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t"] Apr 16 20:07:26.438544 ip-10-0-136-138 kubenswrapper[2567]: W0416 20:07:26.438507 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98c6154b_5b56_499a_a050_3300479f7193.slice/crio-587121e0b659dd5b830628c132cc8cfd765518e26538fcf4d5a78c4adf6aa2b5 WatchSource:0}: Error finding container 587121e0b659dd5b830628c132cc8cfd765518e26538fcf4d5a78c4adf6aa2b5: Status 404 returned error can't find the container with id 587121e0b659dd5b830628c132cc8cfd765518e26538fcf4d5a78c4adf6aa2b5 Apr 16 20:07:26.440802 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.440783 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:07:26.497310 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:26.497281 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" event={"ID":"98c6154b-5b56-499a-a050-3300479f7193","Type":"ContainerStarted","Data":"587121e0b659dd5b830628c132cc8cfd765518e26538fcf4d5a78c4adf6aa2b5"} Apr 16 20:07:29.505454 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:29.505417 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" event={"ID":"98c6154b-5b56-499a-a050-3300479f7193","Type":"ContainerStarted","Data":"fffa7a8d2bab47b2238e5b2c070f9c3a311d2918cb3172e19b7ab9c684d77388"} Apr 16 20:07:29.505895 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:29.505645 2567 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:29.521794 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:29.521742 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" podStartSLOduration=2.14945144 podStartE2EDuration="4.521724238s" podCreationTimestamp="2026-04-16 20:07:25 +0000 UTC" firstStartedPulling="2026-04-16 20:07:26.440958725 +0000 UTC m=+799.716635278" lastFinishedPulling="2026-04-16 20:07:28.813231523 +0000 UTC m=+802.088908076" observedRunningTime="2026-04-16 20:07:29.520523489 +0000 UTC m=+802.796200065" watchObservedRunningTime="2026-04-16 20:07:29.521724238 +0000 UTC m=+802.797400815" Apr 16 20:07:35.515034 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:35.515002 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:36.062260 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:36.062220 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t"] Apr 16 20:07:36.062569 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:36.062522 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" podUID="98c6154b-5b56-499a-a050-3300479f7193" containerName="model-chainer-raw-hpa-78192" containerID="cri-o://fffa7a8d2bab47b2238e5b2c070f9c3a311d2918cb3172e19b7ab9c684d77388" gracePeriod=30 Apr 16 20:07:36.231365 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:36.231330 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv"] Apr 16 20:07:36.233673 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:36.233655 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" Apr 16 20:07:36.241363 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:36.241337 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv"] Apr 16 20:07:36.243689 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:36.243670 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" Apr 16 20:07:36.374416 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:36.374384 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv"] Apr 16 20:07:36.377630 ip-10-0-136-138 kubenswrapper[2567]: W0416 20:07:36.377584 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99369020_73f0_4c96_962b_53b691c705c6.slice/crio-58eb8d3cd60b8716de209c9f18871518bcb9301e60062ed681fd7eb311745aa5 WatchSource:0}: Error finding container 58eb8d3cd60b8716de209c9f18871518bcb9301e60062ed681fd7eb311745aa5: Status 404 returned error can't find the container with id 58eb8d3cd60b8716de209c9f18871518bcb9301e60062ed681fd7eb311745aa5 Apr 16 20:07:36.523066 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:36.523034 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" event={"ID":"99369020-73f0-4c96-962b-53b691c705c6","Type":"ContainerStarted","Data":"58eb8d3cd60b8716de209c9f18871518bcb9301e60062ed681fd7eb311745aa5"} Apr 16 20:07:37.527661 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:37.527631 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" 
event={"ID":"99369020-73f0-4c96-962b-53b691c705c6","Type":"ContainerStarted","Data":"183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827"} Apr 16 20:07:37.528011 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:37.527819 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" Apr 16 20:07:37.529060 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:37.529034 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" podUID="99369020-73f0-4c96-962b-53b691c705c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 20:07:37.540896 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:37.540830 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" podStartSLOduration=0.47978272 podStartE2EDuration="1.540816701s" podCreationTimestamp="2026-04-16 20:07:36 +0000 UTC" firstStartedPulling="2026-04-16 20:07:36.379311293 +0000 UTC m=+809.654987847" lastFinishedPulling="2026-04-16 20:07:37.440345272 +0000 UTC m=+810.716021828" observedRunningTime="2026-04-16 20:07:37.540531791 +0000 UTC m=+810.816208389" watchObservedRunningTime="2026-04-16 20:07:37.540816701 +0000 UTC m=+810.816493277" Apr 16 20:07:38.530932 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:38.530906 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" Apr 16 20:07:40.511911 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:40.511866 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" podUID="98c6154b-5b56-499a-a050-3300479f7193" containerName="model-chainer-raw-hpa-78192" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 16 20:07:45.511990 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:45.511949 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" podUID="98c6154b-5b56-499a-a050-3300479f7193" containerName="model-chainer-raw-hpa-78192" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:07:50.512120 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:50.512080 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" podUID="98c6154b-5b56-499a-a050-3300479f7193" containerName="model-chainer-raw-hpa-78192" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:07:50.512575 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:50.512173 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:07:55.511653 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:07:55.511613 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" podUID="98c6154b-5b56-499a-a050-3300479f7193" containerName="model-chainer-raw-hpa-78192" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:08:00.512576 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:00.512531 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" podUID="98c6154b-5b56-499a-a050-3300479f7193" containerName="model-chainer-raw-hpa-78192" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:08:05.512885 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:05.512824 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" 
podUID="98c6154b-5b56-499a-a050-3300479f7193" containerName="model-chainer-raw-hpa-78192" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:08:06.608627 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:06.608593 2567 generic.go:358] "Generic (PLEG): container finished" podID="98c6154b-5b56-499a-a050-3300479f7193" containerID="fffa7a8d2bab47b2238e5b2c070f9c3a311d2918cb3172e19b7ab9c684d77388" exitCode=0 Apr 16 20:08:06.608972 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:06.608654 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" event={"ID":"98c6154b-5b56-499a-a050-3300479f7193","Type":"ContainerDied","Data":"fffa7a8d2bab47b2238e5b2c070f9c3a311d2918cb3172e19b7ab9c684d77388"} Apr 16 20:08:06.710956 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:06.710933 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:08:06.868979 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:06.868876 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98c6154b-5b56-499a-a050-3300479f7193-proxy-tls\") pod \"98c6154b-5b56-499a-a050-3300479f7193\" (UID: \"98c6154b-5b56-499a-a050-3300479f7193\") " Apr 16 20:08:06.868979 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:06.868965 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98c6154b-5b56-499a-a050-3300479f7193-openshift-service-ca-bundle\") pod \"98c6154b-5b56-499a-a050-3300479f7193\" (UID: \"98c6154b-5b56-499a-a050-3300479f7193\") " Apr 16 20:08:06.869344 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:06.869315 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/98c6154b-5b56-499a-a050-3300479f7193-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "98c6154b-5b56-499a-a050-3300479f7193" (UID: "98c6154b-5b56-499a-a050-3300479f7193"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:08:06.871255 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:06.871227 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c6154b-5b56-499a-a050-3300479f7193-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "98c6154b-5b56-499a-a050-3300479f7193" (UID: "98c6154b-5b56-499a-a050-3300479f7193"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:08:06.969547 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:06.969511 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98c6154b-5b56-499a-a050-3300479f7193-proxy-tls\") on node \"ip-10-0-136-138.ec2.internal\" DevicePath \"\"" Apr 16 20:08:06.969547 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:06.969543 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98c6154b-5b56-499a-a050-3300479f7193-openshift-service-ca-bundle\") on node \"ip-10-0-136-138.ec2.internal\" DevicePath \"\"" Apr 16 20:08:07.612246 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:07.612179 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" Apr 16 20:08:07.612623 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:07.612179 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t" event={"ID":"98c6154b-5b56-499a-a050-3300479f7193","Type":"ContainerDied","Data":"587121e0b659dd5b830628c132cc8cfd765518e26538fcf4d5a78c4adf6aa2b5"} Apr 16 20:08:07.612623 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:07.612299 2567 scope.go:117] "RemoveContainer" containerID="fffa7a8d2bab47b2238e5b2c070f9c3a311d2918cb3172e19b7ab9c684d77388" Apr 16 20:08:07.636262 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:07.636235 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t"] Apr 16 20:08:07.643348 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:07.643328 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-78192-7f674fdfb9-dx64t"] Apr 16 20:08:09.326235 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:08:09.326200 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c6154b-5b56-499a-a050-3300479f7193" path="/var/lib/kubelet/pods/98c6154b-5b56-499a-a050-3300479f7193/volumes" Apr 16 20:09:07.250201 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:07.250175 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log" Apr 16 20:09:07.251966 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:07.251946 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log" Apr 16 20:09:11.355663 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.355623 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-428b1-predictor-795678bc55-ssbzv_99369020-73f0-4c96-962b-53b691c705c6/kserve-container/0.log" Apr 16 20:09:11.506484 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.506451 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv"] Apr 16 20:09:11.506825 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.506779 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" podUID="99369020-73f0-4c96-962b-53b691c705c6" containerName="kserve-container" containerID="cri-o://183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827" gracePeriod=30 Apr 16 20:09:11.751663 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.751639 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" Apr 16 20:09:11.785492 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.785461 2567 generic.go:358] "Generic (PLEG): container finished" podID="99369020-73f0-4c96-962b-53b691c705c6" containerID="183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827" exitCode=2 Apr 16 20:09:11.785628 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.785520 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" Apr 16 20:09:11.785628 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.785522 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" event={"ID":"99369020-73f0-4c96-962b-53b691c705c6","Type":"ContainerDied","Data":"183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827"} Apr 16 20:09:11.785628 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.785622 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv" event={"ID":"99369020-73f0-4c96-962b-53b691c705c6","Type":"ContainerDied","Data":"58eb8d3cd60b8716de209c9f18871518bcb9301e60062ed681fd7eb311745aa5"} Apr 16 20:09:11.785753 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.785638 2567 scope.go:117] "RemoveContainer" containerID="183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827" Apr 16 20:09:11.793574 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.793548 2567 scope.go:117] "RemoveContainer" containerID="183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827" Apr 16 20:09:11.793799 ip-10-0-136-138 kubenswrapper[2567]: E0416 20:09:11.793779 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827\": container with ID starting with 183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827 not found: ID does not exist" containerID="183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827" Apr 16 20:09:11.793880 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.793806 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827"} err="failed to get container status 
\"183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827\": rpc error: code = NotFound desc = could not find container \"183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827\": container with ID starting with 183eddf9b4b3f2c676a946f830f637013db7298a8e47a270be098dc638ef2827 not found: ID does not exist" Apr 16 20:09:11.804561 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.804540 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv"] Apr 16 20:09:11.808079 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:11.808059 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-428b1-predictor-795678bc55-ssbzv"] Apr 16 20:09:13.326197 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:09:13.326163 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99369020-73f0-4c96-962b-53b691c705c6" path="/var/lib/kubelet/pods/99369020-73f0-4c96-962b-53b691c705c6/volumes" Apr 16 20:14:07.267216 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:14:07.267187 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log" Apr 16 20:14:07.270291 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:14:07.270265 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log" Apr 16 20:16:40.245288 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:40.245256 2567 ???:1] "http: TLS handshake error from 10.0.139.205:56992: EOF" Apr 16 20:16:40.249398 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:40.249375 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-q842n_28d3fadc-dade-46f9-9279-3080298ec06b/global-pull-secret-syncer/0.log" Apr 16 20:16:40.411669 ip-10-0-136-138 kubenswrapper[2567]: 
I0416 20:16:40.411631 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lr7sw_e13fe6ed-e68d-4328-9562-990f76414842/konnectivity-agent/0.log" Apr 16 20:16:40.484011 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:40.483983 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-138.ec2.internal_21c919eac4810510b21920a294dfc127/haproxy/0.log" Apr 16 20:16:44.256868 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:44.256820 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5569c8f6b5-h52pg_2deeec4c-a2b4-4d70-9324-1b64013cf1c6/metrics-server/0.log" Apr 16 20:16:44.397894 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:44.397863 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9ghpw_5e98c2cc-e4f6-4adc-a067-f32e188531f9/node-exporter/0.log" Apr 16 20:16:44.420383 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:44.420360 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9ghpw_5e98c2cc-e4f6-4adc-a067-f32e188531f9/kube-rbac-proxy/0.log" Apr 16 20:16:44.442482 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:44.442457 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9ghpw_5e98c2cc-e4f6-4adc-a067-f32e188531f9/init-textfile/0.log" Apr 16 20:16:47.568667 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.568633 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4"] Apr 16 20:16:47.569167 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.568890 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98c6154b-5b56-499a-a050-3300479f7193" containerName="model-chainer-raw-hpa-78192" Apr 16 20:16:47.569167 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.568900 2567 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="98c6154b-5b56-499a-a050-3300479f7193" containerName="model-chainer-raw-hpa-78192" Apr 16 20:16:47.569167 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.568908 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99369020-73f0-4c96-962b-53b691c705c6" containerName="kserve-container" Apr 16 20:16:47.569167 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.568913 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="99369020-73f0-4c96-962b-53b691c705c6" containerName="kserve-container" Apr 16 20:16:47.569167 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.568966 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="98c6154b-5b56-499a-a050-3300479f7193" containerName="model-chainer-raw-hpa-78192" Apr 16 20:16:47.569167 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.568972 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="99369020-73f0-4c96-962b-53b691c705c6" containerName="kserve-container" Apr 16 20:16:47.571759 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.571738 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.573784 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.573765 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9spp\"/\"openshift-service-ca.crt\"" Apr 16 20:16:47.573911 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.573806 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9spp\"/\"kube-root-ca.crt\"" Apr 16 20:16:47.574294 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.574278 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h9spp\"/\"default-dockercfg-lq8qc\"" Apr 16 20:16:47.580711 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.580688 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4"] Apr 16 20:16:47.712209 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.712180 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-proc\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.712344 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.712215 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-podres\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.712344 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.712244 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-sys\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.712344 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.712282 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-lib-modules\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.712344 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.712324 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78vk9\" (UniqueName: \"kubernetes.io/projected/af313e63-e712-4d86-8c05-6d08622d2224-kube-api-access-78vk9\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.812919 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.812877 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-proc\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.813075 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.812931 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-podres\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " 
pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.813075 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.812975 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-sys\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.813075 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.812996 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-proc\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.813075 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.813000 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-lib-modules\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.813075 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.813044 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78vk9\" (UniqueName: \"kubernetes.io/projected/af313e63-e712-4d86-8c05-6d08622d2224-kube-api-access-78vk9\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.813075 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.813071 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-sys\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.813305 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.813105 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-lib-modules\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.813305 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.813108 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af313e63-e712-4d86-8c05-6d08622d2224-podres\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.820495 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.820445 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78vk9\" (UniqueName: \"kubernetes.io/projected/af313e63-e712-4d86-8c05-6d08622d2224-kube-api-access-78vk9\") pod \"perf-node-gather-daemonset-sxfg4\" (UID: \"af313e63-e712-4d86-8c05-6d08622d2224\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.881906 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.881882 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:47.997414 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:47.997386 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4"] Apr 16 20:16:48.000492 ip-10-0-136-138 kubenswrapper[2567]: W0416 20:16:48.000465 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaf313e63_e712_4d86_8c05_6d08622d2224.slice/crio-5e60c9f69a2fa2a8cda5a54a31843ead07485178f5d38c7adc46dc5664c524ec WatchSource:0}: Error finding container 5e60c9f69a2fa2a8cda5a54a31843ead07485178f5d38c7adc46dc5664c524ec: Status 404 returned error can't find the container with id 5e60c9f69a2fa2a8cda5a54a31843ead07485178f5d38c7adc46dc5664c524ec Apr 16 20:16:48.002079 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:48.002062 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:16:48.432148 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:48.432122 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9h7rg_83b13a4d-812f-4c1c-b1cd-cb6c294c587f/dns/0.log" Apr 16 20:16:48.451255 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:48.451215 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9h7rg_83b13a4d-812f-4c1c-b1cd-cb6c294c587f/kube-rbac-proxy/0.log" Apr 16 20:16:48.542512 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:48.542490 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w8m7p_24f35a04-1a02-4c6d-86fb-0b68fcd8fbec/dns-node-resolver/0.log" Apr 16 20:16:48.988061 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:48.988030 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" 
event={"ID":"af313e63-e712-4d86-8c05-6d08622d2224","Type":"ContainerStarted","Data":"05cffbb96bc4c1188aeaa17bd6ae589dd805012f233c391b3c4fa16648284c68"} Apr 16 20:16:48.988061 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:48.988069 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" event={"ID":"af313e63-e712-4d86-8c05-6d08622d2224","Type":"ContainerStarted","Data":"5e60c9f69a2fa2a8cda5a54a31843ead07485178f5d38c7adc46dc5664c524ec"} Apr 16 20:16:48.988450 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:48.988163 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:49.004652 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:49.004612 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" podStartSLOduration=2.004599709 podStartE2EDuration="2.004599709s" podCreationTimestamp="2026-04-16 20:16:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:16:49.003053506 +0000 UTC m=+1362.278730080" watchObservedRunningTime="2026-04-16 20:16:49.004599709 +0000 UTC m=+1362.280276321" Apr 16 20:16:49.055896 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:49.055872 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xz5br_5f32c181-e6f9-4aa8-b370-e213007636e9/node-ca/0.log" Apr 16 20:16:50.085932 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:50.085903 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ngw62_f7c3453a-e123-4e53-b238-f0fd985f362c/serve-healthcheck-canary/0.log" Apr 16 20:16:50.463537 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:50.463505 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-gsnhj_c562fdae-037e-4053-be61-6f0b8eb48c63/kube-rbac-proxy/0.log" Apr 16 20:16:50.485505 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:50.485480 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gsnhj_c562fdae-037e-4053-be61-6f0b8eb48c63/exporter/0.log" Apr 16 20:16:50.505240 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:50.505211 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gsnhj_c562fdae-037e-4053-be61-6f0b8eb48c63/extractor/0.log" Apr 16 20:16:52.553720 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:52.553687 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-659c8cbdc-shq9p_3046784b-2eb5-46e2-b95e-56d76d5679a3/manager/0.log" Apr 16 20:16:52.598515 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:52.598493 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-75v8g_a26cc242-5e43-4902-bc3f-dcb0918a4e4b/server/0.log" Apr 16 20:16:52.674438 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:52.674412 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-7snmp_2d0f2bb7-9f63-42f8-8b60-ce5bc64bfe7f/manager/0.log" Apr 16 20:16:52.718704 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:52.718682 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-69sc9_cd6c3db7-29db-4303-8dc1-05f7d32ba7e3/seaweedfs/0.log" Apr 16 20:16:55.000522 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:55.000492 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-sxfg4" Apr 16 20:16:58.162219 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:58.162189 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f7hpw_3753de4d-d4c5-4f6d-a1a3-bb06177d48f5/kube-multus-additional-cni-plugins/0.log" Apr 16 20:16:58.183387 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:58.183359 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f7hpw_3753de4d-d4c5-4f6d-a1a3-bb06177d48f5/egress-router-binary-copy/0.log" Apr 16 20:16:58.204304 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:58.204278 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f7hpw_3753de4d-d4c5-4f6d-a1a3-bb06177d48f5/cni-plugins/0.log" Apr 16 20:16:58.232587 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:58.232565 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f7hpw_3753de4d-d4c5-4f6d-a1a3-bb06177d48f5/bond-cni-plugin/0.log" Apr 16 20:16:58.253152 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:58.253135 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f7hpw_3753de4d-d4c5-4f6d-a1a3-bb06177d48f5/routeoverride-cni/0.log" Apr 16 20:16:58.272999 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:58.272978 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f7hpw_3753de4d-d4c5-4f6d-a1a3-bb06177d48f5/whereabouts-cni-bincopy/0.log" Apr 16 20:16:58.292803 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:58.292779 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f7hpw_3753de4d-d4c5-4f6d-a1a3-bb06177d48f5/whereabouts-cni/0.log" Apr 16 20:16:58.520706 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:58.520675 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zh765_1a8e9f53-a312-4d09-93df-0fd1a68610ff/kube-multus/0.log" Apr 16 20:16:58.636496 ip-10-0-136-138 
kubenswrapper[2567]: I0416 20:16:58.636466 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x5ml5_f4789667-3ad6-413b-9c9e-a072e7b79d5d/network-metrics-daemon/0.log" Apr 16 20:16:58.656132 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:58.656108 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x5ml5_f4789667-3ad6-413b-9c9e-a072e7b79d5d/kube-rbac-proxy/0.log" Apr 16 20:16:59.789450 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:59.789420 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-controller/0.log" Apr 16 20:16:59.807549 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:59.807525 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/0.log" Apr 16 20:16:59.813649 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:59.813630 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovn-acl-logging/1.log" Apr 16 20:16:59.832049 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:59.832028 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/kube-rbac-proxy-node/0.log" Apr 16 20:16:59.853276 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:59.853257 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 20:16:59.872708 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:59.872679 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/northd/0.log" Apr 16 
20:16:59.892149 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:59.892131 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/nbdb/0.log" Apr 16 20:16:59.912378 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:16:59.912354 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/sbdb/0.log" Apr 16 20:17:00.015517 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:17:00.015490 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j56zc_be3aeb91-80a8-4720-87d2-6479ec1370fc/ovnkube-controller/0.log" Apr 16 20:17:01.349676 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:17:01.349642 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-f7rhh_fba32900-cb28-4cb9-8c67-8874eb5f06ae/network-check-target-container/0.log" Apr 16 20:17:02.265773 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:17:02.265745 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-h7tf4_28cb7eb4-d997-43b5-a1a2-73abb55230e3/iptables-alerter/0.log" Apr 16 20:17:02.909248 ip-10-0-136-138 kubenswrapper[2567]: I0416 20:17:02.909221 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-qldv9_f261d50e-6c86-49ca-ad32-2c77ac5ecb6a/tuned/0.log"