Apr 17 21:10:53.728109 ip-10-0-138-36 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 21:10:53.728123 ip-10-0-138-36 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 21:10:53.728133 ip-10-0-138-36 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 21:10:53.728570 ip-10-0-138-36 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 21:11:03.924242 ip-10-0-138-36 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 21:11:03.924259 ip-10-0-138-36 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f587857d1f21406ebcca6ae20b65b2ad --
Apr 17 21:13:08.680028 ip-10-0-138-36 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 21:13:09.127164 ip-10-0-138-36 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:13:09.127164 ip-10-0-138-36 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 21:13:09.127164 ip-10-0-138-36 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:13:09.127164 ip-10-0-138-36 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 21:13:09.127164 ip-10-0-138-36 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:13:09.128347 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.127766 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 17 21:13:09.130246 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130223 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:13:09.130246 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130246 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130250 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130255 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130258 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130262 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130265 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130268 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130270 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130273 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130276 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130279 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130282 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130285 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130288 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130291 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130307 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130310 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130314 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130316 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:13:09.130321 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130320 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:13:09.130321 ip-10-0-138-36 
kubenswrapper[2576]: W0417 21:13:09.130323 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130325 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130328 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130331 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130334 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130337 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130340 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130343 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130346 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130349 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130352 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130354 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130357 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130360 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130362 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130365 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130368 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130371 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130374 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130377 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:13:09.130857 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130379 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130382 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130385 2576 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130388 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130390 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130393 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130395 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130397 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130400 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130403 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130405 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130408 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130411 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130414 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130417 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130419 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130422 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130432 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130435 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:13:09.131440 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130438 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130441 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130443 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130446 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130448 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130451 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:13:09.131952 
ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130454 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130457 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130460 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130463 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130466 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130468 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130471 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130473 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130476 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130479 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130481 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130486 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130490 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:13:09.131952 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130494 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130497 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130500 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130503 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130507 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130511 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130515 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130967 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130975 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130978 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130981 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130983 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130986 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130989 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130992 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130994 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.130997 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131000 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131002 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:13:09.132435 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131006 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131008 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131011 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131014 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131016 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131019 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131022 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131025 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: 
W0417 21:13:09.131030 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131033 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131037 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131039 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131042 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131045 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131048 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131051 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131053 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131056 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131059 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131062 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:13:09.132947 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131065 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131068 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131071 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131073 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131076 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131079 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131081 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131084 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131086 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131089 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131091 2576 feature_gate.go:328] unrecognized 
feature gate: AWSDedicatedHosts Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131094 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131096 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131100 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131102 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131105 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131107 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131110 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131113 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131116 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:13:09.133489 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131119 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131121 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131124 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131126 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131128 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131131 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131134 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131136 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131138 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131141 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131144 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131146 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131149 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131153 2576 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131155 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131158 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131161 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131163 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131166 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131168 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:13:09.134014 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131171 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131173 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131176 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131178 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131180 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131184 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131187 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131190 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131192 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131195 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131198 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131200 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131203 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131205 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131296 2576 flags.go:64] FLAG: --address="0.0.0.0" Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131305 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 17 21:13:09.134561 
ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131311 2576 flags.go:64] FLAG: --anonymous-auth="true" Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131315 2576 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131320 2576 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131323 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131328 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 17 21:13:09.134561 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131333 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131336 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131340 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131343 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131349 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131352 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131356 2576 flags.go:64] FLAG: --cgroup-root="" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131359 2576 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131362 2576 flags.go:64] FLAG: --client-ca-file="" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131365 2576 flags.go:64] FLAG: --cloud-config="" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131368 2576 flags.go:64] FLAG: --cloud-provider="external" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131371 2576 flags.go:64] FLAG: --cluster-dns="[]" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131376 2576 flags.go:64] FLAG: --cluster-domain="" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131379 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131382 2576 flags.go:64] FLAG: --config-dir="" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131385 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131388 2576 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131392 2576 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131396 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131399 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 
21:13:09.131403 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131406 2576 flags.go:64] FLAG: --contention-profiling="false" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131409 2576 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131412 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131415 2576 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 21:13:09.135104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131418 2576 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131423 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131426 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131429 2576 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131432 2576 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131435 2576 flags.go:64] FLAG: --enable-server="true" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131438 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131443 2576 flags.go:64] FLAG: --event-burst="100" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131446 2576 flags.go:64] FLAG: --event-qps="50" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131450 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131453 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131457 2576 flags.go:64] FLAG: --eviction-hard="" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131461 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131465 2576 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131468 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131471 2576 flags.go:64] FLAG: --eviction-soft="" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131474 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131477 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131480 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131483 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131486 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 
21:13:09.131489 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131492 2576 flags.go:64] FLAG: --feature-gates="" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131496 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131499 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 21:13:09.135734 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131502 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131506 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131509 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131513 2576 flags.go:64] FLAG: --help="false" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131516 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131519 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131522 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131525 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131528 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131533 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131535 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131538 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131541 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131544 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131547 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131550 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131558 2576 flags.go:64] FLAG: --kube-reserved="" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131561 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131564 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131568 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131571 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131574 2576 
flags.go:64] FLAG: --lock-file="" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131576 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131580 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 21:13:09.136353 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131583 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131588 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131591 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131594 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131597 2576 flags.go:64] FLAG: --logging-format="text" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131600 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131603 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131606 2576 flags.go:64] FLAG: --manifest-url="" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131609 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131613 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131617 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131621 2576 flags.go:64] FLAG: --max-pods="110" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131624 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131627 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131630 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131633 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131636 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131640 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131643 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131664 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131668 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131671 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131674 2576 flags.go:64] FLAG: --pod-cidr="" Apr 17 21:13:09.136984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131677 2576 
flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131683 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131688 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131691 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131694 2576 flags.go:64] FLAG: --port="10250" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131698 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131700 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0dcd1f19ac71c871b" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131704 2576 flags.go:64] FLAG: --qos-reserved="" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131707 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131710 2576 flags.go:64] FLAG: --register-node="true" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131713 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131716 2576 flags.go:64] FLAG: --register-with-taints="" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131720 2576 flags.go:64] FLAG: --registry-burst="10" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131723 2576 flags.go:64] FLAG: --registry-qps="5" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131726 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131729 2576 flags.go:64] FLAG: --reserved-memory="" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131733 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131736 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131739 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131742 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131746 2576 flags.go:64] FLAG: --runonce="false" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131749 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131753 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131756 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131759 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131762 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 21:13:09.137568 ip-10-0-138-36 kubenswrapper[2576]: I0417 
21:13:09.131765 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131769 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131772 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131775 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131778 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131781 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131784 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131788 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131792 2576 flags.go:64] FLAG: --system-cgroups="" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131795 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131800 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131803 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131807 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131812 2576 flags.go:64] FLAG: --tls-min-version="" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131815 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131817 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131820 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131823 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131827 2576 flags.go:64] FLAG: --v="2" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131831 2576 flags.go:64] FLAG: --version="false" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131836 2576 flags.go:64] FLAG: --vmodule="" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131840 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.131843 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131949 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:13:09.138251 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131953 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131956 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 
21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131959 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131962 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131967 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131970 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131973 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131975 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131978 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131981 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131983 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131986 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131988 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131991 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131993 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.131996 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132000 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132003 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132005 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:13:09.138861 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132008 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132011 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132014 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132016 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132019 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:13:09.139391 ip-10-0-138-36 
kubenswrapper[2576]: W0417 21:13:09.132022 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132025 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132027 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132030 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132032 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132035 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132038 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132042 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132045 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132047 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132050 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132052 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132056 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132059 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132061 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:13:09.139391 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132064 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132066 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132069 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132071 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132074 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132076 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132079 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132081 2576 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstall Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132084 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132088 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132090 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132093 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132095 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132098 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132101 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132103 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132106 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132109 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132111 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132114 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:13:09.139918 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132116 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132120 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132123 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132126 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132130 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132133 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132136 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132139 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132142 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132145 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132148 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132150 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132153 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132155 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132158 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132161 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132163 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132166 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132169 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132171 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:13:09.140429 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132174 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132178 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132181 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132183 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132186 2576 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.132189 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.132194 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.139025 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.139046 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139098 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139103 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139107 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139110 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139114 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
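The W0417 ... feature_gate.go:328 warnings above are emitted for gate names the kubelet's upstream feature-gate parser has no registration for (they appear to be OpenShift cluster-level gates), while the I0417 ... feature_gate.go:384 "feature gates: {map[...]}" line summarizes the gates it did accept. A minimal sketch, using hypothetical gate names and defaults rather than this node's real registry, of how that parser is driven through k8s.io/component-base/featuregate:

package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

func main() {
	// Register the gates this binary knows about, analogous to the
	// kubelet's built-in registry of upstream Kubernetes gates.
	gates := featuregate.NewFeatureGate()
	if err := gates.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		"ImageVolume": {Default: false, PreRelease: featuregate.Beta}, // hypothetical defaults
		"NodeSwap":    {Default: false, PreRelease: featuregate.Beta},
	}); err != nil {
		panic(err)
	}

	// Apply operator-supplied overrides. Upstream SetFromMap returns an
	// error for names that were never registered; the kubelet build in
	// this log evidently downgrades that case to the repeated
	// "unrecognized feature gate" warnings instead of failing startup.
	if err := gates.SetFromMap(map[string]bool{"ImageVolume": true}); err != nil {
		panic(err)
	}
	fmt.Println("ImageVolume enabled:", gates.Enabled("ImageVolume"))
}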
Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139120 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:13:09.140974 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139123 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139126 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139129 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139132 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139135 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139138 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139141 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139144 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139147 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139150 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139153 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139156 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139158 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139161 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139163 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139166 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139169 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139171 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139174 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139177 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:13:09.141365 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139179 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139182 2576 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139184 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139187 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139189 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139194 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139197 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139199 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139202 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139205 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139209 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139214 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139217 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139220 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139223 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139226 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139228 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139231 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139233 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139236 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:13:09.141876 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139238 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139241 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139244 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139246 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 
21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139249 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139252 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139254 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139257 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139259 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139262 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139264 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139267 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139269 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139272 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139274 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139277 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139280 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139284 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139288 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:13:09.142385 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139290 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139293 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139296 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139299 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139302 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139304 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139307 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139309 
2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139312 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139315 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139317 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139320 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139323 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139326 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139329 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139331 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139334 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139336 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139339 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139341 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:13:09.142916 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139344 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.139349 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139455 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139461 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139464 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139467 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139470 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: 
W0417 21:13:09.139472 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139475 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139477 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139481 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139485 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139491 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139494 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139497 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:13:09.143415 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139500 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139502 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139505 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139508 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139511 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139513 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139516 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139518 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139521 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139523 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139526 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139528 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139531 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139534 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139537 2576 feature_gate.go:328] unrecognized feature 
gate: PreconfiguredUDNAddresses Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139539 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139542 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139544 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139547 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139550 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:13:09.143822 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139553 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139555 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139558 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139561 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139563 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139566 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139569 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139571 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139574 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139577 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139580 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139583 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139585 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139588 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139590 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139594 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139598 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139601 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139603 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:13:09.144347 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139606 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139609 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139612 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139615 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139617 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139620 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139623 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139625 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139628 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139630 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139633 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139636 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139639 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139641 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139644 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139646 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139664 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139668 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139672 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139676 2576 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:13:09.144858 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139680 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139683 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139686 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139690 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139694 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139697 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139700 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139703 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139705 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139708 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139710 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139713 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139716 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:09.139718 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.139724 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 21:13:09.145369 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.140573 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 21:13:09.147225 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.147204 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 21:13:09.148145 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.148132 2576 server.go:1019] "Starting client certificate rotation" Apr 17 21:13:09.148275 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.148250 2576 certificate_manager.go:422] "Certificate rotation is enabled" 
logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 21:13:09.148996 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.148982 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 21:13:09.175840 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.175815 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 21:13:09.180726 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.180695 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 21:13:09.199113 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.199082 2576 log.go:25] "Validated CRI v1 runtime API" Apr 17 21:13:09.204971 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.204944 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 21:13:09.205092 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.205006 2576 log.go:25] "Validated CRI v1 image API" Apr 17 21:13:09.206885 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.206869 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 21:13:09.211596 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.211562 2576 fs.go:135] Filesystem UUIDs: map[1f1398cf-47e2-494e-ad4d-4e1833651dc3:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e277626d-cb50-4bda-8e8e-947d1e70166c:/dev/nvme0n1p3] Apr 17 21:13:09.211672 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.211594 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 21:13:09.217374 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.217225 2576 manager.go:217] Machine: {Timestamp:2026-04-17 21:13:09.215505687 +0000 UTC m=+0.416998552 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3086363 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23f2d21eb02077226be928305637d7 SystemUUID:ec23f2d2-1eb0-2077-226b-e928305637d7 BootID:f587857d-1f21-406e-bcca-6ae20b65b2ad Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] 
NetworkDevices:[{Name:br-ex MacAddress:02:b3:96:3e:5e:ad Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b3:96:3e:5e:ad Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:44:cf:dd:5c:9c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 21:13:09.217374 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.217365 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 17 21:13:09.217536 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.217523 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 21:13:09.218578 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.218545 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 21:13:09.218752 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.218581 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-138-36.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 21:13:09.218838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.218763 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 21:13:09.218838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.218773 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 21:13:09.218838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.218787 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 21:13:09.220037 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.220023 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 21:13:09.220941 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.220928 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 21:13:09.221261 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.221249 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 21:13:09.223429 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.223415 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 17 21:13:09.223469 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.223434 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 21:13:09.223469 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.223458 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 21:13:09.223535 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.223473 2576 kubelet.go:397] "Adding apiserver pod source" Apr 17 21:13:09.223535 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.223483 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 21:13:09.224491 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.224468 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 21:13:09.224534 
ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.224497 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 21:13:09.227320 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.227288 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 21:13:09.231124 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.231092 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 21:13:09.232800 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232780 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 21:13:09.232800 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232804 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 21:13:09.232922 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232813 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 21:13:09.232922 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232822 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 21:13:09.232922 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232829 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 21:13:09.232922 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232835 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 21:13:09.232922 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232841 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 21:13:09.232922 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232847 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 21:13:09.232922 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232854 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 21:13:09.232922 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232861 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 21:13:09.232922 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232870 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 21:13:09.232922 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.232878 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 21:13:09.233701 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.233690 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 21:13:09.233701 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.233701 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 21:13:09.236441 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.236405 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 21:13:09.236441 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.236423 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster 
scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 21:13:09.237549 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.237535 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 21:13:09.237622 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.237581 2576 server.go:1295] "Started kubelet" Apr 17 21:13:09.237753 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.237680 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 21:13:09.237788 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.237735 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 21:13:09.237829 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.237815 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 21:13:09.238510 ip-10-0-138-36 systemd[1]: Started Kubernetes Kubelet. Apr 17 21:13:09.239288 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.239100 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 21:13:09.240372 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.240357 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 17 21:13:09.245171 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.245144 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 21:13:09.245171 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.245150 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 21:13:09.245814 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.245792 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 21:13:09.245814 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.245818 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 21:13:09.245998 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.245936 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 21:13:09.245998 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.245999 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 17 21:13:09.246092 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.246004 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 17 21:13:09.246208 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.246184 2576 factory.go:55] Registering systemd factory Apr 17 21:13:09.246208 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.246197 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:09.246208 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.246214 2576 factory.go:223] Registration of the systemd container factory successfully Apr 17 21:13:09.246478 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.246450 2576 factory.go:153] Registering CRI-O factory Apr 17 21:13:09.246478 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.246471 2576 factory.go:223] Registration of the crio container factory successfully Apr 17 21:13:09.246616 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.246592 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 21:13:09.246704 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.246621 2576 
factory.go:103] Registering Raw factory Apr 17 21:13:09.246704 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.246639 2576 manager.go:1196] Started watching for new ooms in manager Apr 17 21:13:09.247092 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.247071 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-36.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 21:13:09.247092 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.247075 2576 manager.go:319] Starting recovery of all containers Apr 17 21:13:09.248040 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.247179 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-36.ec2.internal.18a7414fda0e6de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-36.ec2.internal,UID:ip-10-0-138-36.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-36.ec2.internal,},FirstTimestamp:2026-04-17 21:13:09.237550565 +0000 UTC m=+0.439043430,LastTimestamp:2026-04-17 21:13:09.237550565 +0000 UTC m=+0.439043430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-36.ec2.internal,}" Apr 17 21:13:09.248383 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.248363 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 21:13:09.252163 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.252124 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 21:13:09.252334 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.252192 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 21:13:09.258351 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.258318 2576 manager.go:324] Recovery completed Apr 17 21:13:09.263345 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.263321 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:13:09.264482 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.264457 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2qvvl" Apr 17 21:13:09.265793 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.265778 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:13:09.265837 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.265813 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:13:09.265837 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.265828 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:13:09.266316 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.266303 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 21:13:09.266357 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.266317 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 21:13:09.266357 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.266337 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 21:13:09.267640 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.267566 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-36.ec2.internal.18a7414fdbbd69b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-36.ec2.internal,UID:ip-10-0-138-36.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-36.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-36.ec2.internal,},FirstTimestamp:2026-04-17 21:13:09.265795512 +0000 UTC m=+0.467288380,LastTimestamp:2026-04-17 21:13:09.265795512 +0000 UTC m=+0.467288380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-36.ec2.internal,}" Apr 17 21:13:09.269199 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.269180 2576 policy_none.go:49] "None policy: Start" Apr 17 21:13:09.269199 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.269203 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 21:13:09.269354 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.269218 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 17 21:13:09.271430 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.271410 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2qvvl" Apr 17 21:13:09.323258 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.314451 2576 manager.go:341] "Starting Device Plugin manager" Apr 17 21:13:09.323258 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.314497 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 21:13:09.323258 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.314508 2576 server.go:85] "Starting device plugin registration server" Apr 17 21:13:09.323258 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.314827 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 21:13:09.323258 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.314868 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 21:13:09.323258 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.314980 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 21:13:09.323258 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.315061 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 21:13:09.323258 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.315070 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 21:13:09.323258 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.315588 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 21:13:09.323258 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.315618 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:09.377868 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.377785 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 21:13:09.378982 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.378963 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 21:13:09.379044 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.378996 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 21:13:09.379044 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.379018 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 21:13:09.379044 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.379024 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 21:13:09.379143 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.379064 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 21:13:09.381277 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.381244 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:13:09.416002 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.415967 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:13:09.418256 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.418238 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:13:09.418312 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.418275 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:13:09.418312 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.418289 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:13:09.418383 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.418314 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.426448 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.426425 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.426510 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.426459 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-36.ec2.internal\": node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:09.443906 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.443875 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:09.479214 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.479176 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal"] Apr 17 21:13:09.479378 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.479262 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:13:09.480243 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.480224 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:13:09.480336 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.480260 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:13:09.480336 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.480278 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:13:09.482566 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.482546 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:13:09.482723 ip-10-0-138-36 kubenswrapper[2576]: I0417 
21:13:09.482707 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.482772 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.482742 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:13:09.483366 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.483350 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:13:09.483466 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.483384 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:13:09.483466 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.483397 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:13:09.483466 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.483352 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:13:09.483466 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.483437 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:13:09.483466 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.483450 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:13:09.485541 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.485514 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.485673 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.485546 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:13:09.486363 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.486347 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:13:09.486434 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.486375 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:13:09.486434 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.486390 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:13:09.498342 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.498310 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-36.ec2.internal\" not found" node="ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.501892 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.501874 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-36.ec2.internal\" not found" node="ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.544695 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.544646 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:09.645100 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.645027 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:09.648433 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.648412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/19a3ac82673ab3918b84502042418413-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal\" (UID: \"19a3ac82673ab3918b84502042418413\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.648500 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.648443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19a3ac82673ab3918b84502042418413-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal\" (UID: \"19a3ac82673ab3918b84502042418413\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.648500 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.648466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a2b7385231aa0ba2edc60e303dfabae0-config\") pod \"kube-apiserver-proxy-ip-10-0-138-36.ec2.internal\" (UID: \"a2b7385231aa0ba2edc60e303dfabae0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.745842 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.745808 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:09.749130 ip-10-0-138-36 kubenswrapper[2576]: 
I0417 21:13:09.749103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/19a3ac82673ab3918b84502042418413-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal\" (UID: \"19a3ac82673ab3918b84502042418413\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.749130 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.749122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/19a3ac82673ab3918b84502042418413-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal\" (UID: \"19a3ac82673ab3918b84502042418413\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.749249 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.749157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19a3ac82673ab3918b84502042418413-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal\" (UID: \"19a3ac82673ab3918b84502042418413\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.749249 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.749182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a2b7385231aa0ba2edc60e303dfabae0-config\") pod \"kube-apiserver-proxy-ip-10-0-138-36.ec2.internal\" (UID: \"a2b7385231aa0ba2edc60e303dfabae0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.749249 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.749206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a2b7385231aa0ba2edc60e303dfabae0-config\") pod \"kube-apiserver-proxy-ip-10-0-138-36.ec2.internal\" (UID: \"a2b7385231aa0ba2edc60e303dfabae0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.749249 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.749239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19a3ac82673ab3918b84502042418413-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal\" (UID: \"19a3ac82673ab3918b84502042418413\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.800250 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.800211 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.805209 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:09.805186 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal" Apr 17 21:13:09.846168 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.846131 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:09.946613 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:09.946529 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:10.047006 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:10.046969 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:10.147557 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:10.147511 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:10.148602 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.148580 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 21:13:10.148796 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.148777 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 21:13:10.245515 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.245468 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 21:13:10.247683 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:10.247643 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:10.262619 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.262589 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 21:13:10.274205 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.274164 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 21:08:09 +0000 UTC" deadline="2028-01-22 06:51:02.494083364 +0000 UTC" Apr 17 21:13:10.274205 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.274200 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15465h37m52.219886427s" Apr 17 21:13:10.286386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.286356 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-28ljm" Apr 17 21:13:10.293642 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.293617 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-28ljm" Apr 17 21:13:10.321043 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.321014 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:13:10.338199 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:10.338165 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19a3ac82673ab3918b84502042418413.slice/crio-a8c330b59a9f31a231413d7d4b5ef23c0b00dfa9e74bfe85845e8804fb4c6b53 WatchSource:0}: Error finding container a8c330b59a9f31a231413d7d4b5ef23c0b00dfa9e74bfe85845e8804fb4c6b53: Status 404 returned error can't find the container with id a8c330b59a9f31a231413d7d4b5ef23c0b00dfa9e74bfe85845e8804fb4c6b53 Apr 17 21:13:10.343310 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.343289 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:13:10.348239 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:10.348217 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:10.349665 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:10.349622 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2b7385231aa0ba2edc60e303dfabae0.slice/crio-027747df3bf757906bd485d9b59a98f0d41376db4de06153a1d3f08387e5b2c6 WatchSource:0}: Error finding container 027747df3bf757906bd485d9b59a98f0d41376db4de06153a1d3f08387e5b2c6: Status 404 returned error can't find the container with id 027747df3bf757906bd485d9b59a98f0d41376db4de06153a1d3f08387e5b2c6 Apr 17 21:13:10.381880 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.381825 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal" event={"ID":"a2b7385231aa0ba2edc60e303dfabae0","Type":"ContainerStarted","Data":"027747df3bf757906bd485d9b59a98f0d41376db4de06153a1d3f08387e5b2c6"} Apr 17 21:13:10.382831 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.382807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" event={"ID":"19a3ac82673ab3918b84502042418413","Type":"ContainerStarted","Data":"a8c330b59a9f31a231413d7d4b5ef23c0b00dfa9e74bfe85845e8804fb4c6b53"} Apr 17 21:13:10.449230 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:10.449196 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:10.549906 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:10.549819 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-36.ec2.internal\" not found" Apr 17 21:13:10.596128 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.596103 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:13:10.617849 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.617823 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:13:10.645574 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.645531 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" Apr 17 21:13:10.657646 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.657609 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 21:13:10.659170 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.659149 2576 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal" Apr 17 21:13:10.665214 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:10.665191 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 21:13:11.224852 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.224797 2576 apiserver.go:52] "Watching apiserver" Apr 17 21:13:11.230627 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.230593 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:13:11.231917 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.231884 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 21:13:11.233030 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.232998 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-xqgf5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54","openshift-multus/multus-additional-cni-plugins-fnlrt","openshift-multus/network-metrics-daemon-rcbth","openshift-network-diagnostics/network-check-target-2mm6m","openshift-ovn-kubernetes/ovnkube-node-wvwmf","kube-system/konnectivity-agent-xsg6r","kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal","openshift-cluster-node-tuning-operator/tuned-xzsg6","openshift-dns/node-resolver-dj96t","openshift-image-registry/node-ca-8bvhr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal","openshift-multus/multus-mjf2v"] Apr 17 21:13:11.236048 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.236021 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:11.238264 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.238230 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 21:13:11.238387 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.238268 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.238387 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.238335 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zkp65\"" Apr 17 21:13:11.238541 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.238514 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 21:13:11.241133 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.241103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lk4tq\"" Apr 17 21:13:11.241336 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.241317 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 21:13:11.241560 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.241411 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 21:13:11.241560 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.241415 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 21:13:11.243278 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.242977 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.245542 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.245055 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 21:13:11.245542 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.245118 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 21:13:11.245542 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.245263 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:11.245542 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.245340 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:11.245542 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.245354 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 21:13:11.245542 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.245350 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 21:13:11.245542 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.245404 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hv2kl\"" Apr 17 21:13:11.246016 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.245601 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 21:13:11.248037 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.248015 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:11.248135 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.248082 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:11.248198 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.248170 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.250694 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.250362 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 21:13:11.250694 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.250427 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 21:13:11.250694 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.250461 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-fdsrs\"" Apr 17 21:13:11.250929 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.250724 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.250929 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.250899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 21:13:11.251459 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.251177 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 21:13:11.251459 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.251249 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 21:13:11.251459 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.251457 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 21:13:11.253210 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.253176 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.253342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.253319 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 21:13:11.253846 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.253677 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 21:13:11.253846 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.253691 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f872d\"" Apr 17 21:13:11.253846 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.253727 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.255815 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74wd8\" (UniqueName: \"kubernetes.io/projected/8a39439a-b5a0-4399-975e-838c219449b7-kube-api-access-74wd8\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256064 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ntzjq\"" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-systemd-units\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h22kz\" (UniqueName: \"kubernetes.io/projected/c72dca5f-90a2-417b-924c-22d40135ba3c-kube-api-access-h22kz\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8698539-048e-4326-92f2-2a5997c36c34-cni-binary-copy\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256345 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e155af39-2618-4164-82af-86a051e4a586-agent-certs\") pod \"konnectivity-agent-xsg6r\" (UID: \"e155af39-2618-4164-82af-86a051e4a586\") " pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c8698539-048e-4326-92f2-2a5997c36c34-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-cni-bin\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/10a9acf7-61cb-4537-a133-e83e8426fd8f-ovn-node-metrics-cert\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/10a9acf7-61cb-4537-a133-e83e8426fd8f-ovnkube-script-lib\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-var-lib-openvswitch\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2tjk\" (UniqueName: \"kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk\") pod \"network-check-target-2mm6m\" (UID: \"7c2a9506-e348-4ac6-930a-2264e8f5db1f\") " pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:11.256937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-run-ovn-kubernetes\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.256978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-run-systemd\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-node-log\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257515 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/10a9acf7-61cb-4537-a133-e83e8426fd8f-env-overrides\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjfbl\" (UniqueName: \"kubernetes.io/projected/10a9acf7-61cb-4537-a133-e83e8426fd8f-kube-api-access-zjfbl\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-socket-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: 
I0417 21:13:11.257640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-etc-selinux\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-system-cni-dir\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-etc-openvswitch\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-slash\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-log-socket\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.257847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-device-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257863 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-sys-fs\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c8698539-048e-4326-92f2-2a5997c36c34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257922 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-cni-netd\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-registration-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.257981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-run-openvswitch\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258070 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-kubelet\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-run-ovn\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-cnibin\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-os-release\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pn5c\" (UniqueName: \"kubernetes.io/projected/c8698539-048e-4326-92f2-2a5997c36c34-kube-api-access-5pn5c\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-run-netns\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/10a9acf7-61cb-4537-a133-e83e8426fd8f-ovnkube-config\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.258386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e155af39-2618-4164-82af-86a051e4a586-konnectivity-ca\") pod \"konnectivity-agent-xsg6r\" (UID: \"e155af39-2618-4164-82af-86a051e4a586\") " pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:11.259048 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.258403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.259815 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.259233 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.259815 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.259282 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.261112 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.261085 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 21:13:11.261324 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.261291 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 21:13:11.261493 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.261472 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5qs5g\"" Apr 17 21:13:11.261869 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.261844 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 21:13:11.262288 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.262028 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 21:13:11.262288 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.262061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.262288 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.262186 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-t8jv7\"" Apr 17 21:13:11.262485 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.262393 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 21:13:11.264110 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.264083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 21:13:11.264202 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.264149 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8cjps\"" Apr 17 21:13:11.294290 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.294257 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 21:08:10 +0000 UTC" deadline="2027-11-16 07:05:49.227840765 +0000 UTC" Apr 17 21:13:11.294436 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.294397 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13857h52m37.933448878s" Apr 17 21:13:11.346784 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.346757 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 21:13:11.358826 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.358790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/10a9acf7-61cb-4537-a133-e83e8426fd8f-env-overrides\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.358977 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.358840 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-cni-dir\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.358977 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.358866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-sysctl-conf\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.358977 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.358883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-sys-fs\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.358977 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.358899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-kubernetes\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.358977 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.358968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-cni-netd\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.358992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becd6e32-bdde-40bf-bef7-c2f14ff29b2b-serviceca\") pod \"node-ca-8bvhr\" (UID: \"becd6e32-bdde-40bf-bef7-c2f14ff29b2b\") " pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.358994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-sys-fs\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7-host-slash\") pod \"iptables-alerter-xqgf5\" (UID: \"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7\") " pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-cni-netd\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359074 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xmc\" (UniqueName: \"kubernetes.io/projected/73df3c3b-340e-459f-a30d-51085c37c69b-kube-api-access-79xmc\") pod \"node-resolver-dj96t\" (UID: \"73df3c3b-340e-459f-a30d-51085c37c69b\") " pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7dsz\" (UniqueName: \"kubernetes.io/projected/becd6e32-bdde-40bf-bef7-c2f14ff29b2b-kube-api-access-f7dsz\") pod \"node-ca-8bvhr\" (UID: \"becd6e32-bdde-40bf-bef7-c2f14ff29b2b\") " pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-cnibin\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-socket-dir-parent\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359154 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-run-ovn\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.359213 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.359707 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-cnibin\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.359707 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-run-ovn\") pod \"ovnkube-node-wvwmf\" 
(UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.359707 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pn5c\" (UniqueName: \"kubernetes.io/projected/c8698539-048e-4326-92f2-2a5997c36c34-kube-api-access-5pn5c\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.359707 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-cnibin\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.359707 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/becd6e32-bdde-40bf-bef7-c2f14ff29b2b-host\") pod \"node-ca-8bvhr\" (UID: \"becd6e32-bdde-40bf-bef7-c2f14ff29b2b\") " pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.359707 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e155af39-2618-4164-82af-86a051e4a586-konnectivity-ca\") pod \"konnectivity-agent-xsg6r\" (UID: \"e155af39-2618-4164-82af-86a051e4a586\") " pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:11.359707 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.359707 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359425 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/10a9acf7-61cb-4537-a133-e83e8426fd8f-env-overrides\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-sysconfig\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-var-lib-kubelet\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8698539-048e-4326-92f2-2a5997c36c34-cni-binary-copy\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-systemd\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/73df3c3b-340e-459f-a30d-51085c37c69b-hosts-file\") pod \"node-resolver-dj96t\" (UID: \"73df3c3b-340e-459f-a30d-51085c37c69b\") " pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e155af39-2618-4164-82af-86a051e4a586-agent-certs\") pod \"konnectivity-agent-xsg6r\" (UID: \"e155af39-2618-4164-82af-86a051e4a586\") " pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c8698539-048e-4326-92f2-2a5997c36c34-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e155af39-2618-4164-82af-86a051e4a586-konnectivity-ca\") pod \"konnectivity-agent-xsg6r\" (UID: \"e155af39-2618-4164-82af-86a051e4a586\") " pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.359917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-os-release\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.360055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-run-netns\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 
17 21:13:11.360540 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360235 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 21:13:11.360540 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8698539-048e-4326-92f2-2a5997c36c34-cni-binary-copy\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.360540 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c8698539-048e-4326-92f2-2a5997c36c34-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.360643 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-modprobe-d\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.360643 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-cni-bin\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.360643 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-socket-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-run\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-cni-bin\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-lib-modules\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-run-systemd\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-socket-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-node-log\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjfbl\" (UniqueName: \"kubernetes.io/projected/10a9acf7-61cb-4537-a133-e83e8426fd8f-kube-api-access-zjfbl\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.360800 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-run-systemd\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.360818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-etc-selinux\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-sys\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.361224 ip-10-0-138-36 
kubenswrapper[2576]: E0417 21:13:11.360887 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs podName:8a39439a-b5a0-4399-975e-838c219449b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:11.860854986 +0000 UTC m=+3.062347839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs") pod "network-metrics-daemon-rcbth" (UID: "8a39439a-b5a0-4399-975e-838c219449b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-etc-selinux\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-etc-openvswitch\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-slash\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-etc-openvswitch\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.360982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-log-socket\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-slash\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-device-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361030 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-node-log\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c8698539-048e-4326-92f2-2a5997c36c34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-device-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-var-lib-cni-bin\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-log-socket\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-var-lib-cni-multus\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.361224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-registration-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361167 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-run-k8s-cni-cncf-io\") pod 
\"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361190 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-daemon-config\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c72dca5f-90a2-417b-924c-22d40135ba3c-registration-dir\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-host\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-run-openvswitch\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-run-openvswitch\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-kubelet\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-kubelet\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-os-release\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-etc-kubernetes\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-os-release\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7-iptables-alerter-script\") pod \"iptables-alerter-xqgf5\" (UID: \"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7\") " pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-run-netns\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c8698539-048e-4326-92f2-2a5997c36c34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.361819 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/10a9acf7-61cb-4537-a133-e83e8426fd8f-ovnkube-config\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-run-netns\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361562 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f845aa7c-34f0-4456-ae46-79b75dee87d0-cni-binary-copy\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.362365 
ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361589 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-var-lib-kubelet\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-run-multus-certs\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4npzs\" (UniqueName: \"kubernetes.io/projected/f845aa7c-34f0-4456-ae46-79b75dee87d0-kube-api-access-4npzs\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tjwl\" (UniqueName: \"kubernetes.io/projected/be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7-kube-api-access-9tjwl\") pod \"iptables-alerter-xqgf5\" (UID: \"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7\") " pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-sysctl-d\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361725 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74wd8\" (UniqueName: \"kubernetes.io/projected/8a39439a-b5a0-4399-975e-838c219449b7-kube-api-access-74wd8\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-systemd-units\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h22kz\" (UniqueName: \"kubernetes.io/projected/c72dca5f-90a2-417b-924c-22d40135ba3c-kube-api-access-h22kz\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" 
(UniqueName: \"kubernetes.io/empty-dir/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-tuned\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34e31fa7-2e88-478a-8508-aa93c3b79f1d-tmp\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/73df3c3b-340e-459f-a30d-51085c37c69b-tmp-dir\") pod \"node-resolver-dj96t\" (UID: \"73df3c3b-340e-459f-a30d-51085c37c69b\") " pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-conf-dir\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj8dt\" (UniqueName: \"kubernetes.io/projected/34e31fa7-2e88-478a-8508-aa93c3b79f1d-kube-api-access-bj8dt\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/10a9acf7-61cb-4537-a133-e83e8426fd8f-ovn-node-metrics-cert\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.362365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/10a9acf7-61cb-4537-a133-e83e8426fd8f-ovnkube-config\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.361977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/10a9acf7-61cb-4537-a133-e83e8426fd8f-ovnkube-script-lib\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.362006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-var-lib-openvswitch\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 
21:13:11.362026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-systemd-units\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.362123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tjk\" (UniqueName: \"kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk\") pod \"network-check-target-2mm6m\" (UID: \"7c2a9506-e348-4ac6-930a-2264e8f5db1f\") " pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.362151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-run-ovn-kubernetes\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.362203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-var-lib-openvswitch\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.362244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-system-cni-dir\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.362272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-system-cni-dir\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.362298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-hostroot\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.362386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10a9acf7-61cb-4537-a133-e83e8426fd8f-host-run-ovn-kubernetes\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.362617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8698539-048e-4326-92f2-2a5997c36c34-system-cni-dir\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: 
\"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.363075 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.363070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/10a9acf7-61cb-4537-a133-e83e8426fd8f-ovnkube-script-lib\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.366155 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.364024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e155af39-2618-4164-82af-86a051e4a586-agent-certs\") pod \"konnectivity-agent-xsg6r\" (UID: \"e155af39-2618-4164-82af-86a051e4a586\") " pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:11.366423 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.366397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/10a9acf7-61cb-4537-a133-e83e8426fd8f-ovn-node-metrics-cert\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.368550 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.368482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pn5c\" (UniqueName: \"kubernetes.io/projected/c8698539-048e-4326-92f2-2a5997c36c34-kube-api-access-5pn5c\") pod \"multus-additional-cni-plugins-fnlrt\" (UID: \"c8698539-048e-4326-92f2-2a5997c36c34\") " pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.369976 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.369935 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:11.369976 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.369960 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:11.369976 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.369974 2576 projected.go:194] Error preparing data for projected volume kube-api-access-r2tjk for pod openshift-network-diagnostics/network-check-target-2mm6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:11.370187 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.370064 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk podName:7c2a9506-e348-4ac6-930a-2264e8f5db1f nodeName:}" failed. No retries permitted until 2026-04-17 21:13:11.870048161 +0000 UTC m=+3.071541013 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r2tjk" (UniqueName: "kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk") pod "network-check-target-2mm6m" (UID: "7c2a9506-e348-4ac6-930a-2264e8f5db1f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:11.371895 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.371855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjfbl\" (UniqueName: \"kubernetes.io/projected/10a9acf7-61cb-4537-a133-e83e8426fd8f-kube-api-access-zjfbl\") pod \"ovnkube-node-wvwmf\" (UID: \"10a9acf7-61cb-4537-a133-e83e8426fd8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.373116 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.373091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h22kz\" (UniqueName: \"kubernetes.io/projected/c72dca5f-90a2-417b-924c-22d40135ba3c-kube-api-access-h22kz\") pod \"aws-ebs-csi-driver-node-29f54\" (UID: \"c72dca5f-90a2-417b-924c-22d40135ba3c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.373203 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.373162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74wd8\" (UniqueName: \"kubernetes.io/projected/8a39439a-b5a0-4399-975e-838c219449b7-kube-api-access-74wd8\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:11.463568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-os-release\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.463568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-run-netns\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-modprobe-d\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-run\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-lib-modules\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-sys\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-var-lib-cni-bin\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-var-lib-cni-multus\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-run-k8s-cni-cncf-io\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-daemon-config\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-host\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-etc-kubernetes\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7-iptables-alerter-script\") pod \"iptables-alerter-xqgf5\" (UID: \"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7\") " pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.463838 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f845aa7c-34f0-4456-ae46-79b75dee87d0-cni-binary-copy\") pod \"multus-mjf2v\" (UID: 
\"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-sys\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-var-lib-kubelet\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-etc-kubernetes\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463906 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-var-lib-cni-multus\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-run-multus-certs\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-run-multus-certs\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4npzs\" (UniqueName: \"kubernetes.io/projected/f845aa7c-34f0-4456-ae46-79b75dee87d0-kube-api-access-4npzs\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tjwl\" (UniqueName: \"kubernetes.io/projected/be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7-kube-api-access-9tjwl\") pod \"iptables-alerter-xqgf5\" (UID: \"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7\") " pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-sysctl-d\") pod \"tuned-xzsg6\" (UID: 
\"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-var-lib-kubelet\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.463899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-run-netns\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-modprobe-d\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-tuned\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34e31fa7-2e88-478a-8508-aa93c3b79f1d-tmp\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-os-release\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-run\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-lib-modules\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.464435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-sysctl-d\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 
21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464434 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f845aa7c-34f0-4456-ae46-79b75dee87d0-cni-binary-copy\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-daemon-config\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-run-k8s-cni-cncf-io\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-host\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/73df3c3b-340e-459f-a30d-51085c37c69b-tmp-dir\") pod \"node-resolver-dj96t\" (UID: \"73df3c3b-340e-459f-a30d-51085c37c69b\") " pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-conf-dir\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464602 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-host-var-lib-cni-bin\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj8dt\" (UniqueName: \"kubernetes.io/projected/34e31fa7-2e88-478a-8508-aa93c3b79f1d-kube-api-access-bj8dt\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-system-cni-dir\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464744 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-hostroot\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-cni-dir\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-sysctl-conf\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-kubernetes\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becd6e32-bdde-40bf-bef7-c2f14ff29b2b-serviceca\") pod \"node-ca-8bvhr\" (UID: \"becd6e32-bdde-40bf-bef7-c2f14ff29b2b\") " pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7-host-slash\") pod \"iptables-alerter-xqgf5\" (UID: \"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7\") " pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79xmc\" (UniqueName: \"kubernetes.io/projected/73df3c3b-340e-459f-a30d-51085c37c69b-kube-api-access-79xmc\") pod \"node-resolver-dj96t\" (UID: \"73df3c3b-340e-459f-a30d-51085c37c69b\") " pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/73df3c3b-340e-459f-a30d-51085c37c69b-tmp-dir\") pod \"node-resolver-dj96t\" (UID: \"73df3c3b-340e-459f-a30d-51085c37c69b\") " pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.465342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464911 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-cni-dir\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464916 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f7dsz\" (UniqueName: \"kubernetes.io/projected/becd6e32-bdde-40bf-bef7-c2f14ff29b2b-kube-api-access-f7dsz\") pod \"node-ca-8bvhr\" (UID: \"becd6e32-bdde-40bf-bef7-c2f14ff29b2b\") " pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-cnibin\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-system-cni-dir\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-socket-dir-parent\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.464982 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-hostroot\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/becd6e32-bdde-40bf-bef7-c2f14ff29b2b-host\") pod \"node-ca-8bvhr\" (UID: \"becd6e32-bdde-40bf-bef7-c2f14ff29b2b\") " pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7-iptables-alerter-script\") pod \"iptables-alerter-xqgf5\" (UID: \"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7\") " pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-sysconfig\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-var-lib-kubelet\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-systemd\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-socket-dir-parent\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/73df3c3b-340e-459f-a30d-51085c37c69b-hosts-file\") pod \"node-resolver-dj96t\" (UID: \"73df3c3b-340e-459f-a30d-51085c37c69b\") " pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-sysctl-conf\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/73df3c3b-340e-459f-a30d-51085c37c69b-hosts-file\") pod \"node-resolver-dj96t\" (UID: \"73df3c3b-340e-459f-a30d-51085c37c69b\") " pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-kubernetes\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/becd6e32-bdde-40bf-bef7-c2f14ff29b2b-host\") pod \"node-ca-8bvhr\" (UID: \"becd6e32-bdde-40bf-bef7-c2f14ff29b2b\") " pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-multus-conf-dir\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.466770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f845aa7c-34f0-4456-ae46-79b75dee87d0-cnibin\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.467744 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-sysconfig\") pod 
\"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.467744 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465238 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-systemd\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.467744 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7-host-slash\") pod \"iptables-alerter-xqgf5\" (UID: \"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7\") " pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.467744 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34e31fa7-2e88-478a-8508-aa93c3b79f1d-var-lib-kubelet\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.467744 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.465693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/becd6e32-bdde-40bf-bef7-c2f14ff29b2b-serviceca\") pod \"node-ca-8bvhr\" (UID: \"becd6e32-bdde-40bf-bef7-c2f14ff29b2b\") " pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.467744 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.466723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34e31fa7-2e88-478a-8508-aa93c3b79f1d-etc-tuned\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.467744 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.466779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34e31fa7-2e88-478a-8508-aa93c3b79f1d-tmp\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.472363 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.472333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4npzs\" (UniqueName: \"kubernetes.io/projected/f845aa7c-34f0-4456-ae46-79b75dee87d0-kube-api-access-4npzs\") pod \"multus-mjf2v\" (UID: \"f845aa7c-34f0-4456-ae46-79b75dee87d0\") " pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.472500 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.472402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7dsz\" (UniqueName: \"kubernetes.io/projected/becd6e32-bdde-40bf-bef7-c2f14ff29b2b-kube-api-access-f7dsz\") pod \"node-ca-8bvhr\" (UID: \"becd6e32-bdde-40bf-bef7-c2f14ff29b2b\") " pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.472614 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.472536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79xmc\" (UniqueName: \"kubernetes.io/projected/73df3c3b-340e-459f-a30d-51085c37c69b-kube-api-access-79xmc\") pod \"node-resolver-dj96t\" (UID: 
\"73df3c3b-340e-459f-a30d-51085c37c69b\") " pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.472789 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.472770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tjwl\" (UniqueName: \"kubernetes.io/projected/be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7-kube-api-access-9tjwl\") pod \"iptables-alerter-xqgf5\" (UID: \"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7\") " pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.472836 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.472773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj8dt\" (UniqueName: \"kubernetes.io/projected/34e31fa7-2e88-478a-8508-aa93c3b79f1d-kube-api-access-bj8dt\") pod \"tuned-xzsg6\" (UID: \"34e31fa7-2e88-478a-8508-aa93c3b79f1d\") " pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.549816 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.549777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:11.561713 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.561676 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" Apr 17 21:13:11.571607 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.571578 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" Apr 17 21:13:11.578358 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.578330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:11.586121 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.586083 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xqgf5" Apr 17 21:13:11.594850 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.594820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" Apr 17 21:13:11.601593 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.601559 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8bvhr" Apr 17 21:13:11.608308 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.608274 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dj96t" Apr 17 21:13:11.612982 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.612958 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mjf2v" Apr 17 21:13:11.867183 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.867085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:11.867354 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.867233 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:11.867354 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.867304 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs podName:8a39439a-b5a0-4399-975e-838c219449b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:12.867284064 +0000 UTC m=+4.068776921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs") pod "network-metrics-daemon-rcbth" (UID: "8a39439a-b5a0-4399-975e-838c219449b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:11.968342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:11.968308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tjk\" (UniqueName: \"kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk\") pod \"network-check-target-2mm6m\" (UID: \"7c2a9506-e348-4ac6-930a-2264e8f5db1f\") " pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:11.968502 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.968470 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:11.968561 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.968503 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:11.968561 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.968516 2576 projected.go:194] Error preparing data for projected volume kube-api-access-r2tjk for pod openshift-network-diagnostics/network-check-target-2mm6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:11.968636 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:11.968571 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk podName:7c2a9506-e348-4ac6-930a-2264e8f5db1f nodeName:}" failed. No retries permitted until 2026-04-17 21:13:12.968556316 +0000 UTC m=+4.170049172 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r2tjk" (UniqueName: "kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk") pod "network-check-target-2mm6m" (UID: "7c2a9506-e348-4ac6-930a-2264e8f5db1f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:11.989767 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:11.989731 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe10bfaa_e0ca_4b7d_afef_d043cb7c9ba7.slice/crio-3864489e2984f41e5e10182276ecc8f6c5a9d3afa13a5e86626171fd85d3b1a1 WatchSource:0}: Error finding container 3864489e2984f41e5e10182276ecc8f6c5a9d3afa13a5e86626171fd85d3b1a1: Status 404 returned error can't find the container with id 3864489e2984f41e5e10182276ecc8f6c5a9d3afa13a5e86626171fd85d3b1a1 Apr 17 21:13:11.990780 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:11.990744 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbecd6e32_bdde_40bf_bef7_c2f14ff29b2b.slice/crio-375f5ae829018906701832433322a4b318e1fc34010dd04d7530ee0d46816d8e WatchSource:0}: Error finding container 375f5ae829018906701832433322a4b318e1fc34010dd04d7530ee0d46816d8e: Status 404 returned error can't find the container with id 375f5ae829018906701832433322a4b318e1fc34010dd04d7530ee0d46816d8e Apr 17 21:13:11.992768 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:11.992745 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10a9acf7_61cb_4537_a133_e83e8426fd8f.slice/crio-ea470a6bca78c324a5a26f6c1c588c96a7de24f2f699aaa3ed7d832eff6037dd WatchSource:0}: Error finding container ea470a6bca78c324a5a26f6c1c588c96a7de24f2f699aaa3ed7d832eff6037dd: Status 404 returned error can't find the container with id ea470a6bca78c324a5a26f6c1c588c96a7de24f2f699aaa3ed7d832eff6037dd Apr 17 21:13:11.996499 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:11.996460 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8698539_048e_4326_92f2_2a5997c36c34.slice/crio-6e3372609c3941e176f7859b1662dc948ee526213a2458020aecb5baf5d6a6d9 WatchSource:0}: Error finding container 6e3372609c3941e176f7859b1662dc948ee526213a2458020aecb5baf5d6a6d9: Status 404 returned error can't find the container with id 6e3372609c3941e176f7859b1662dc948ee526213a2458020aecb5baf5d6a6d9 Apr 17 21:13:11.997225 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:11.997199 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72dca5f_90a2_417b_924c_22d40135ba3c.slice/crio-078090651a9b9a801d572ea616f170f34ff22ed62c11f06101e87564bd3ecfad WatchSource:0}: Error finding container 078090651a9b9a801d572ea616f170f34ff22ed62c11f06101e87564bd3ecfad: Status 404 returned error can't find the container with id 078090651a9b9a801d572ea616f170f34ff22ed62c11f06101e87564bd3ecfad Apr 17 21:13:12.019263 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:12.019235 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e31fa7_2e88_478a_8508_aa93c3b79f1d.slice/crio-7360372a29c7a5fb207b6a5a31adf9cef3089a89fed0e8c0e5dd3325099d5d9d WatchSource:0}: Error finding 
container 7360372a29c7a5fb207b6a5a31adf9cef3089a89fed0e8c0e5dd3325099d5d9d: Status 404 returned error can't find the container with id 7360372a29c7a5fb207b6a5a31adf9cef3089a89fed0e8c0e5dd3325099d5d9d Apr 17 21:13:12.019792 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:12.019766 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73df3c3b_340e_459f_a30d_51085c37c69b.slice/crio-a562b193785ec834517aaf54ca0ff744efea57cdd1fba2d5b4b88e7e6c0155bd WatchSource:0}: Error finding container a562b193785ec834517aaf54ca0ff744efea57cdd1fba2d5b4b88e7e6c0155bd: Status 404 returned error can't find the container with id a562b193785ec834517aaf54ca0ff744efea57cdd1fba2d5b4b88e7e6c0155bd Apr 17 21:13:12.021099 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:12.021079 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode155af39_2618_4164_82af_86a051e4a586.slice/crio-f55027d0b82cf4fd3767accfe67d85e69726aca6078cdfd28ed016aec7010b73 WatchSource:0}: Error finding container f55027d0b82cf4fd3767accfe67d85e69726aca6078cdfd28ed016aec7010b73: Status 404 returned error can't find the container with id f55027d0b82cf4fd3767accfe67d85e69726aca6078cdfd28ed016aec7010b73 Apr 17 21:13:12.022109 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:12.022087 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf845aa7c_34f0_4456_ae46_79b75dee87d0.slice/crio-ba837933a909683dda48ca42f3651f3f0a420d5fd72e1fbdce4c4bcfe6e922e5 WatchSource:0}: Error finding container ba837933a909683dda48ca42f3651f3f0a420d5fd72e1fbdce4c4bcfe6e922e5: Status 404 returned error can't find the container with id ba837933a909683dda48ca42f3651f3f0a420d5fd72e1fbdce4c4bcfe6e922e5 Apr 17 21:13:12.295275 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.295189 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 21:08:10 +0000 UTC" deadline="2027-12-01 21:17:24.679605064 +0000 UTC" Apr 17 21:13:12.295275 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.295223 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14232h4m12.384384942s" Apr 17 21:13:12.379927 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.379897 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:12.380096 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:12.380011 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:12.386152 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.386119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xsg6r" event={"ID":"e155af39-2618-4164-82af-86a051e4a586","Type":"ContainerStarted","Data":"f55027d0b82cf4fd3767accfe67d85e69726aca6078cdfd28ed016aec7010b73"} Apr 17 21:13:12.387031 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.387003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mjf2v" event={"ID":"f845aa7c-34f0-4456-ae46-79b75dee87d0","Type":"ContainerStarted","Data":"ba837933a909683dda48ca42f3651f3f0a420d5fd72e1fbdce4c4bcfe6e922e5"} Apr 17 21:13:12.388009 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.387981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" event={"ID":"34e31fa7-2e88-478a-8508-aa93c3b79f1d","Type":"ContainerStarted","Data":"7360372a29c7a5fb207b6a5a31adf9cef3089a89fed0e8c0e5dd3325099d5d9d"} Apr 17 21:13:12.393759 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.393732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" event={"ID":"c8698539-048e-4326-92f2-2a5997c36c34","Type":"ContainerStarted","Data":"6e3372609c3941e176f7859b1662dc948ee526213a2458020aecb5baf5d6a6d9"} Apr 17 21:13:12.394620 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.394592 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8bvhr" event={"ID":"becd6e32-bdde-40bf-bef7-c2f14ff29b2b","Type":"ContainerStarted","Data":"375f5ae829018906701832433322a4b318e1fc34010dd04d7530ee0d46816d8e"} Apr 17 21:13:12.395516 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.395495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xqgf5" event={"ID":"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7","Type":"ContainerStarted","Data":"3864489e2984f41e5e10182276ecc8f6c5a9d3afa13a5e86626171fd85d3b1a1"} Apr 17 21:13:12.396965 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.396942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal" event={"ID":"a2b7385231aa0ba2edc60e303dfabae0","Type":"ContainerStarted","Data":"85fd57e58a3ec1a6f77296aef98ea8a207114417c2d8e1c0aa4e9d3738a05eec"} Apr 17 21:13:12.397862 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.397841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dj96t" event={"ID":"73df3c3b-340e-459f-a30d-51085c37c69b","Type":"ContainerStarted","Data":"a562b193785ec834517aaf54ca0ff744efea57cdd1fba2d5b4b88e7e6c0155bd"} Apr 17 21:13:12.401100 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.401073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" event={"ID":"c72dca5f-90a2-417b-924c-22d40135ba3c","Type":"ContainerStarted","Data":"078090651a9b9a801d572ea616f170f34ff22ed62c11f06101e87564bd3ecfad"} Apr 17 21:13:12.402057 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.402025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" event={"ID":"10a9acf7-61cb-4537-a133-e83e8426fd8f","Type":"ContainerStarted","Data":"ea470a6bca78c324a5a26f6c1c588c96a7de24f2f699aaa3ed7d832eff6037dd"} Apr 17 21:13:12.410961 ip-10-0-138-36 kubenswrapper[2576]: 
I0417 21:13:12.410909 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-36.ec2.internal" podStartSLOduration=2.410893291 podStartE2EDuration="2.410893291s" podCreationTimestamp="2026-04-17 21:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:13:12.41082016 +0000 UTC m=+3.612313038" watchObservedRunningTime="2026-04-17 21:13:12.410893291 +0000 UTC m=+3.612386167" Apr 17 21:13:12.874983 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.874939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:12.875181 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:12.875107 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:12.875181 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:12.875171 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs podName:8a39439a-b5a0-4399-975e-838c219449b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:14.875151899 +0000 UTC m=+6.076644754 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs") pod "network-metrics-daemon-rcbth" (UID: "8a39439a-b5a0-4399-975e-838c219449b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:12.976235 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:12.976133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tjk\" (UniqueName: \"kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk\") pod \"network-check-target-2mm6m\" (UID: \"7c2a9506-e348-4ac6-930a-2264e8f5db1f\") " pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:12.976400 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:12.976387 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:12.976456 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:12.976410 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:12.976456 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:12.976423 2576 projected.go:194] Error preparing data for projected volume kube-api-access-r2tjk for pod openshift-network-diagnostics/network-check-target-2mm6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:12.976582 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:12.976485 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk podName:7c2a9506-e348-4ac6-930a-2264e8f5db1f nodeName:}" failed. 
No retries permitted until 2026-04-17 21:13:14.976465221 +0000 UTC m=+6.177958077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-r2tjk" (UniqueName: "kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk") pod "network-check-target-2mm6m" (UID: "7c2a9506-e348-4ac6-930a-2264e8f5db1f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:13.380223 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:13.380138 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:13.380696 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:13.380265 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:13.414737 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:13.414698 2576 generic.go:358] "Generic (PLEG): container finished" podID="19a3ac82673ab3918b84502042418413" containerID="2d2bfd8c86b1b06d868bc2c9d6601ad8e765b67cdea5f8379318b371856e73e3" exitCode=0 Apr 17 21:13:13.415741 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:13.415715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" event={"ID":"19a3ac82673ab3918b84502042418413","Type":"ContainerDied","Data":"2d2bfd8c86b1b06d868bc2c9d6601ad8e765b67cdea5f8379318b371856e73e3"} Apr 17 21:13:14.380025 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:14.379988 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:14.380229 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:14.380139 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:14.428385 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:14.428344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" event={"ID":"19a3ac82673ab3918b84502042418413","Type":"ContainerStarted","Data":"69b6d7d3dcd8b2855000624fe5ee01a994fa5ff63562118d380a7d23fce26649"} Apr 17 21:13:14.894350 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:14.894276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:14.894549 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:14.894465 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:14.894549 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:14.894529 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs podName:8a39439a-b5a0-4399-975e-838c219449b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:18.894510981 +0000 UTC m=+10.096003836 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs") pod "network-metrics-daemon-rcbth" (UID: "8a39439a-b5a0-4399-975e-838c219449b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:14.995506 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:14.995469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tjk\" (UniqueName: \"kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk\") pod \"network-check-target-2mm6m\" (UID: \"7c2a9506-e348-4ac6-930a-2264e8f5db1f\") " pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:14.995729 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:14.995710 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:14.995819 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:14.995733 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:14.995819 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:14.995745 2576 projected.go:194] Error preparing data for projected volume kube-api-access-r2tjk for pod openshift-network-diagnostics/network-check-target-2mm6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:14.995819 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:14.995811 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk podName:7c2a9506-e348-4ac6-930a-2264e8f5db1f nodeName:}" failed. No retries permitted until 2026-04-17 21:13:18.995790615 +0000 UTC m=+10.197283485 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r2tjk" (UniqueName: "kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk") pod "network-check-target-2mm6m" (UID: "7c2a9506-e348-4ac6-930a-2264e8f5db1f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:15.380139 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:15.380105 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:15.380318 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:15.380234 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:16.379687 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:16.379639 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:16.380131 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:16.379801 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:17.009118 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.009046 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-36.ec2.internal" podStartSLOduration=7.00902597 podStartE2EDuration="7.00902597s" podCreationTimestamp="2026-04-17 21:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:13:14.443320955 +0000 UTC m=+5.644813894" watchObservedRunningTime="2026-04-17 21:13:17.00902597 +0000 UTC m=+8.210518845" Apr 17 21:13:17.009520 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.009499 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-txkcf"] Apr 17 21:13:17.020208 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.020172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:17.020379 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:17.020266 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:17.115027 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.114984 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:17.115208 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.115126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f888011-8182-4d16-8a26-05b9e1670eb0-dbus\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:17.115208 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.115185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f888011-8182-4d16-8a26-05b9e1670eb0-kubelet-config\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:17.215924 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.215868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:17.216114 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.215949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f888011-8182-4d16-8a26-05b9e1670eb0-dbus\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:17.216114 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.215996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f888011-8182-4d16-8a26-05b9e1670eb0-kubelet-config\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:17.216114 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:17.216046 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:17.216114 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.216084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f888011-8182-4d16-8a26-05b9e1670eb0-kubelet-config\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:17.216304 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:17.216130 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret podName:2f888011-8182-4d16-8a26-05b9e1670eb0 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:13:17.716108425 +0000 UTC m=+8.917601282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret") pod "global-pull-secret-syncer-txkcf" (UID: "2f888011-8182-4d16-8a26-05b9e1670eb0") : object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:17.216304 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.216170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f888011-8182-4d16-8a26-05b9e1670eb0-dbus\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:17.380293 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.380217 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:17.380778 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:17.380348 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:17.720545 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:17.720449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:17.720808 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:17.720665 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:17.720808 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:17.720752 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret podName:2f888011-8182-4d16-8a26-05b9e1670eb0 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:18.720731913 +0000 UTC m=+9.922224769 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret") pod "global-pull-secret-syncer-txkcf" (UID: "2f888011-8182-4d16-8a26-05b9e1670eb0") : object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:18.379876 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:18.379839 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:18.380055 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:18.379984 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:18.728886 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:18.728799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:18.729313 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:18.728973 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:18.729313 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:18.729063 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret podName:2f888011-8182-4d16-8a26-05b9e1670eb0 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:20.729042369 +0000 UTC m=+11.930535223 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret") pod "global-pull-secret-syncer-txkcf" (UID: "2f888011-8182-4d16-8a26-05b9e1670eb0") : object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:18.930806 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:18.930767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:18.930966 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:18.930924 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:18.931019 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:18.930985 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs podName:8a39439a-b5a0-4399-975e-838c219449b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:26.930967545 +0000 UTC m=+18.132460402 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs") pod "network-metrics-daemon-rcbth" (UID: "8a39439a-b5a0-4399-975e-838c219449b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:19.032090 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:19.032001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tjk\" (UniqueName: \"kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk\") pod \"network-check-target-2mm6m\" (UID: \"7c2a9506-e348-4ac6-930a-2264e8f5db1f\") " pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:19.032297 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:19.032187 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:19.032297 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:19.032212 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:19.032297 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:19.032226 2576 projected.go:194] Error preparing data for projected volume kube-api-access-r2tjk for pod openshift-network-diagnostics/network-check-target-2mm6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:19.032297 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:19.032295 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk podName:7c2a9506-e348-4ac6-930a-2264e8f5db1f nodeName:}" failed. No retries permitted until 2026-04-17 21:13:27.032276034 +0000 UTC m=+18.233768910 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-r2tjk" (UniqueName: "kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk") pod "network-check-target-2mm6m" (UID: "7c2a9506-e348-4ac6-930a-2264e8f5db1f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:19.381751 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:19.380977 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:19.381751 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:19.381084 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:19.381751 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:19.381510 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:19.381751 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:19.381606 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:20.379743 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:20.379706 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:20.380178 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:20.379849 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:20.742816 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:20.742730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:20.742965 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:20.742869 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:20.742965 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:20.742928 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret podName:2f888011-8182-4d16-8a26-05b9e1670eb0 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:24.742911777 +0000 UTC m=+15.944404629 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret") pod "global-pull-secret-syncer-txkcf" (UID: "2f888011-8182-4d16-8a26-05b9e1670eb0") : object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:21.380498 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.380210 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:21.381252 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:21.380554 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:21.381252 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.380250 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:21.381252 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:21.380825 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:21.444680 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.444629 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8698539-048e-4326-92f2-2a5997c36c34" containerID="6f391115ead181d7b09ecb41fe6227d5d2890ed5c64dec4e76849cc5ae6877f1" exitCode=0 Apr 17 21:13:21.444874 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.444750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" event={"ID":"c8698539-048e-4326-92f2-2a5997c36c34","Type":"ContainerDied","Data":"6f391115ead181d7b09ecb41fe6227d5d2890ed5c64dec4e76849cc5ae6877f1"} Apr 17 21:13:21.446772 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.446748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8bvhr" event={"ID":"becd6e32-bdde-40bf-bef7-c2f14ff29b2b","Type":"ContainerStarted","Data":"d4d92bac7d69ba2a878f7ed46f8d8663e8ef41c5b5d4405d775f88a9fd36a3ce"} Apr 17 21:13:21.448965 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.448929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dj96t" event={"ID":"73df3c3b-340e-459f-a30d-51085c37c69b","Type":"ContainerStarted","Data":"bf49e3c0b57acc58484768a7f0abfcbfd7d8b6aab38aaad55617fe1f6bdf1ad4"} Apr 17 21:13:21.450995 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.450967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" event={"ID":"c72dca5f-90a2-417b-924c-22d40135ba3c","Type":"ContainerStarted","Data":"65aa14398d509fa5ac2ce719ad198b32ab8055ed8837a310c4184640fa2a800f"} Apr 17 21:13:21.452678 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.452626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xsg6r" event={"ID":"e155af39-2618-4164-82af-86a051e4a586","Type":"ContainerStarted","Data":"c325c5775d435f2a2389c28b8959d455ee09081597acea3d6cd75c4ac087df58"} Apr 17 21:13:21.454596 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.454574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" event={"ID":"34e31fa7-2e88-478a-8508-aa93c3b79f1d","Type":"ContainerStarted","Data":"fc78c7a1ad9106973148dd3ab172c460a75ed737cd327b3477db679ac4b28343"} Apr 17 21:13:21.477709 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.477639 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xzsg6" podStartSLOduration=3.6162806830000003 podStartE2EDuration="12.477625978s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:13:12.021019247 +0000 UTC m=+3.222512100" lastFinishedPulling="2026-04-17 21:13:20.88236454 +0000 UTC m=+12.083857395" observedRunningTime="2026-04-17 21:13:21.47761724 +0000 UTC m=+12.679110116" watchObservedRunningTime="2026-04-17 21:13:21.477625978 +0000 UTC m=+12.679118852" Apr 17 21:13:21.501621 ip-10-0-138-36 
kubenswrapper[2576]: I0417 21:13:21.501554 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xsg6r" podStartSLOduration=3.644111554 podStartE2EDuration="12.501536234s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:13:12.025295997 +0000 UTC m=+3.226788849" lastFinishedPulling="2026-04-17 21:13:20.882720671 +0000 UTC m=+12.084213529" observedRunningTime="2026-04-17 21:13:21.500840537 +0000 UTC m=+12.702333411" watchObservedRunningTime="2026-04-17 21:13:21.501536234 +0000 UTC m=+12.703029109" Apr 17 21:13:21.501846 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:21.501811 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dj96t" podStartSLOduration=3.641205244 podStartE2EDuration="12.501800961s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:13:12.02176779 +0000 UTC m=+3.223260648" lastFinishedPulling="2026-04-17 21:13:20.882363495 +0000 UTC m=+12.083856365" observedRunningTime="2026-04-17 21:13:21.488801061 +0000 UTC m=+12.690293934" watchObservedRunningTime="2026-04-17 21:13:21.501800961 +0000 UTC m=+12.703293837" Apr 17 21:13:22.379920 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:22.379802 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:22.380087 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:22.379958 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:22.458126 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:22.458088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xqgf5" event={"ID":"be10bfaa-e0ca-4b7d-afef-d043cb7c9ba7","Type":"ContainerStarted","Data":"e3ab2a15721407838ac20f22c9b94b74bdd3b653d217e654869ba9119dba82e6"} Apr 17 21:13:22.471809 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:22.471751 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8bvhr" podStartSLOduration=4.584827075 podStartE2EDuration="13.471733494s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:13:11.994125027 +0000 UTC m=+3.195617893" lastFinishedPulling="2026-04-17 21:13:20.881031457 +0000 UTC m=+12.082524312" observedRunningTime="2026-04-17 21:13:21.523115734 +0000 UTC m=+12.724608609" watchObservedRunningTime="2026-04-17 21:13:22.471733494 +0000 UTC m=+13.673226367" Apr 17 21:13:22.472008 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:22.471971 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xqgf5" podStartSLOduration=4.581169897 podStartE2EDuration="13.471961968s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:13:11.994301405 +0000 UTC m=+3.195794270" lastFinishedPulling="2026-04-17 21:13:20.885093481 +0000 UTC m=+12.086586341" observedRunningTime="2026-04-17 21:13:22.471590203 +0000 UTC m=+13.673083077" watchObservedRunningTime="2026-04-17 21:13:22.471961968 +0000 UTC m=+13.673454843" Apr 17 21:13:23.263886 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:23.263850 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:23.264764 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:23.264742 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:23.380150 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:23.380098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:23.380319 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:23.380232 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:23.380399 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:23.380361 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:23.380551 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:23.380498 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:24.379890 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:24.379850 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:24.380335 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:24.380002 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:24.461338 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:24.461310 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 21:13:24.775308 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:24.775234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:24.775469 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:24.775344 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:24.775469 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:24.775409 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret podName:2f888011-8182-4d16-8a26-05b9e1670eb0 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:32.77539176 +0000 UTC m=+23.976884615 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret") pod "global-pull-secret-syncer-txkcf" (UID: "2f888011-8182-4d16-8a26-05b9e1670eb0") : object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:25.379530 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:25.379265 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:25.379728 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:25.379310 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:25.379728 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:25.379674 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:25.379867 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:25.379774 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:26.379683 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:26.379628 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:26.380220 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:26.379795 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:26.506022 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:26.505982 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:26.506185 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:26.506107 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 21:13:26.506694 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:26.506674 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xsg6r" Apr 17 21:13:26.992376 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:26.992344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:26.992580 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:26.992530 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:26.992634 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:26.992602 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs podName:8a39439a-b5a0-4399-975e-838c219449b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:42.992581665 +0000 UTC m=+34.194074516 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs") pod "network-metrics-daemon-rcbth" (UID: "8a39439a-b5a0-4399-975e-838c219449b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:27.093312 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:27.093277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tjk\" (UniqueName: \"kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk\") pod \"network-check-target-2mm6m\" (UID: \"7c2a9506-e348-4ac6-930a-2264e8f5db1f\") " pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:27.093481 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:27.093424 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:27.093481 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:27.093439 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:27.093481 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:27.093448 2576 projected.go:194] Error preparing data for projected volume kube-api-access-r2tjk for pod openshift-network-diagnostics/network-check-target-2mm6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:27.093606 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:27.093497 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk podName:7c2a9506-e348-4ac6-930a-2264e8f5db1f nodeName:}" failed. No retries permitted until 2026-04-17 21:13:43.093482204 +0000 UTC m=+34.294975056 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-r2tjk" (UniqueName: "kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk") pod "network-check-target-2mm6m" (UID: "7c2a9506-e348-4ac6-930a-2264e8f5db1f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:27.379508 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:27.379422 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:27.379686 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:27.379422 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:27.379686 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:27.379548 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:27.380158 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:27.379677 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:28.379673 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:28.379612 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:28.379869 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:28.379754 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:29.380397 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:29.380358 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:29.380832 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:29.380477 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:29.380832 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:29.380551 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:29.380832 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:29.380681 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:30.380062 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:30.380030 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:30.380326 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:30.380170 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:31.379296 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:31.379256 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:31.379863 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:31.379265 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:31.379863 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:31.379371 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:31.379863 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:31.379833 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:32.379492 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:32.379460 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:32.379862 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:32.379575 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:32.600474 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:32.600441 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 21:13:32.830216 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:32.829999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:32.830347 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:32.830155 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:32.830398 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:32.830349 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret podName:2f888011-8182-4d16-8a26-05b9e1670eb0 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:48.83033396 +0000 UTC m=+40.031826815 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret") pod "global-pull-secret-syncer-txkcf" (UID: "2f888011-8182-4d16-8a26-05b9e1670eb0") : object "kube-system"/"original-pull-secret" not registered Apr 17 21:13:33.328210 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.328011 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T21:13:32.600473422Z","UUID":"97441385-672d-4dec-88d8-c666d6c5e81d","Handler":null,"Name":"","Endpoint":""} Apr 17 21:13:33.331044 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.331017 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 21:13:33.331172 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.331053 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 21:13:33.380095 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.380063 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:33.380875 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.380061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:33.380875 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:33.380195 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:33.380875 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:33.380303 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:33.479869 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.479834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mjf2v" event={"ID":"f845aa7c-34f0-4456-ae46-79b75dee87d0","Type":"ContainerStarted","Data":"145e7e2e3b6dff7a97ab75bd9ef9b3d96d5bcb886269b469351117284d764c6f"} Apr 17 21:13:33.481483 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.481460 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8698539-048e-4326-92f2-2a5997c36c34" containerID="d19f15f0714d1b5ebcdc9b1ed9cb8152d111c8345bf4c305aa046f25f8ba8ea1" exitCode=0 Apr 17 21:13:33.481602 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.481534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" event={"ID":"c8698539-048e-4326-92f2-2a5997c36c34","Type":"ContainerDied","Data":"d19f15f0714d1b5ebcdc9b1ed9cb8152d111c8345bf4c305aa046f25f8ba8ea1"} Apr 17 21:13:33.483101 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.483078 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" event={"ID":"c72dca5f-90a2-417b-924c-22d40135ba3c","Type":"ContainerStarted","Data":"d2435147a746e28b9ffdbf441fa82c7fe633a209b19f75f74c42d4a4dd56a235"} Apr 17 21:13:33.485495 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.485474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" event={"ID":"10a9acf7-61cb-4537-a133-e83e8426fd8f","Type":"ContainerStarted","Data":"62336c88df574b6230d7ba6b7fdb229bfe0a3fd123480fe50da2b91b748c45e5"} Apr 17 21:13:33.485587 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.485500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" event={"ID":"10a9acf7-61cb-4537-a133-e83e8426fd8f","Type":"ContainerStarted","Data":"63fe56d53da4b0e78fe89f1c359ec1c655a8b6f44dc669215f24a4ac05e1e181"} Apr 17 21:13:33.485587 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.485509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" event={"ID":"10a9acf7-61cb-4537-a133-e83e8426fd8f","Type":"ContainerStarted","Data":"71ae647510d76bbc119c0d0c3c793e3573344513467c4b3bef6b1a7ae2aeb713"} Apr 17 21:13:33.485587 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.485518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" event={"ID":"10a9acf7-61cb-4537-a133-e83e8426fd8f","Type":"ContainerStarted","Data":"0ceee3f9f5d1599f09092dae6c842be60c7a4d853cb7967e7cb898d16be34ed9"} Apr 17 21:13:33.485587 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.485525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" event={"ID":"10a9acf7-61cb-4537-a133-e83e8426fd8f","Type":"ContainerStarted","Data":"406c09d74bcdcbc96b721e3160fd7683ba1d7f1207a6950aebaf7109a90cf107"} Apr 17 21:13:33.485587 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.485533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" event={"ID":"10a9acf7-61cb-4537-a133-e83e8426fd8f","Type":"ContainerStarted","Data":"5dbceb91c7bfd6b3c57a590b44ea7eca7d07d8c1c17a31031ffd302b3e8885e6"} Apr 17 21:13:33.521596 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:33.521549 2576 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-multus/multus-mjf2v" podStartSLOduration=4.121843983 podStartE2EDuration="24.521534863s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:13:12.025360623 +0000 UTC m=+3.226853475" lastFinishedPulling="2026-04-17 21:13:32.425051489 +0000 UTC m=+23.626544355" observedRunningTime="2026-04-17 21:13:33.503875159 +0000 UTC m=+24.705368033" watchObservedRunningTime="2026-04-17 21:13:33.521534863 +0000 UTC m=+24.723027736" Apr 17 21:13:34.380186 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:34.380147 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:34.380700 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:34.380279 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:34.488863 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:34.488830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" event={"ID":"c72dca5f-90a2-417b-924c-22d40135ba3c","Type":"ContainerStarted","Data":"760be98236af88768a07e994035c1fc0f3e8f79744efceb11664792d9a0f43b0"} Apr 17 21:13:34.503623 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:34.503572 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29f54" podStartSLOduration=3.693728243 podStartE2EDuration="25.50355815s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:13:12.016940952 +0000 UTC m=+3.218433807" lastFinishedPulling="2026-04-17 21:13:33.82677086 +0000 UTC m=+25.028263714" observedRunningTime="2026-04-17 21:13:34.503391256 +0000 UTC m=+25.704884129" watchObservedRunningTime="2026-04-17 21:13:34.50355815 +0000 UTC m=+25.705051073" Apr 17 21:13:35.379876 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:35.379793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:35.380033 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:35.379881 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:35.380033 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:35.379966 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:35.380139 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:35.380066 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:35.492637 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:35.492604 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8698539-048e-4326-92f2-2a5997c36c34" containerID="f87157fa25e309d3e2a2fc9742dd7b76636aeae33445f09f9b5803cac5aa232d" exitCode=0 Apr 17 21:13:35.493094 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:35.492686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" event={"ID":"c8698539-048e-4326-92f2-2a5997c36c34","Type":"ContainerDied","Data":"f87157fa25e309d3e2a2fc9742dd7b76636aeae33445f09f9b5803cac5aa232d"} Apr 17 21:13:35.495513 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:35.495491 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" event={"ID":"10a9acf7-61cb-4537-a133-e83e8426fd8f","Type":"ContainerStarted","Data":"2b6c1ef87b02e80cce744668a1b0e14edf6488e76586b1334358778edde28ef9"} Apr 17 21:13:36.379414 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:36.379388 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:36.379544 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:36.379526 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:36.499589 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:36.499550 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8698539-048e-4326-92f2-2a5997c36c34" containerID="1a25d6d27330e8429ba0ee26c5e8a0e0558bb8b7b92c41894d8f4ecf6e153477" exitCode=0 Apr 17 21:13:36.499968 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:36.499594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" event={"ID":"c8698539-048e-4326-92f2-2a5997c36c34","Type":"ContainerDied","Data":"1a25d6d27330e8429ba0ee26c5e8a0e0558bb8b7b92c41894d8f4ecf6e153477"} Apr 17 21:13:37.379774 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:37.379732 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:37.379774 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:37.379755 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:37.380007 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:37.379882 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:37.380112 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:37.380026 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:37.505067 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:37.504940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" event={"ID":"10a9acf7-61cb-4537-a133-e83e8426fd8f","Type":"ContainerStarted","Data":"ccffd5ae8a2afee34a45871848a7df6a98d538bffda701cc9aa471d45fb86b13"} Apr 17 21:13:37.505503 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:37.505278 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:37.522761 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:37.522732 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:37.531169 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:37.531104 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" podStartSLOduration=8.101855068 podStartE2EDuration="28.531085672s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:13:11.995408273 +0000 UTC m=+3.196901140" lastFinishedPulling="2026-04-17 21:13:32.424638893 +0000 UTC m=+23.626131744" observedRunningTime="2026-04-17 21:13:37.53034085 +0000 UTC m=+28.731833725" watchObservedRunningTime="2026-04-17 21:13:37.531085672 +0000 UTC m=+28.732578548" Apr 17 21:13:38.379842 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:38.379807 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:38.380046 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:38.379952 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:38.507952 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:38.507896 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:38.507952 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:38.507935 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:38.531674 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:38.530562 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:13:39.192436 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:39.192403 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-txkcf"] Apr 17 21:13:39.192581 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:39.192499 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:39.192623 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:39.192579 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:39.198496 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:39.197137 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rcbth"] Apr 17 21:13:39.198496 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:39.197177 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2mm6m"] Apr 17 21:13:39.198496 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:39.197286 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:39.198496 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:39.197426 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:39.198496 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:39.197534 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:39.198496 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:39.197668 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:40.379567 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:40.379533 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:40.380068 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:40.379674 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:41.379823 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:41.379751 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:41.380176 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:41.379751 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:41.380176 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:41.379858 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:41.380176 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:41.379911 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:42.380241 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:42.380022 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:42.380686 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:42.380340 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:43.009444 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:43.009410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:43.009623 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:43.009535 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:43.009623 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:43.009602 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs podName:8a39439a-b5a0-4399-975e-838c219449b7 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:14:15.009583236 +0000 UTC m=+66.211076091 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs") pod "network-metrics-daemon-rcbth" (UID: "8a39439a-b5a0-4399-975e-838c219449b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:13:43.109844 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:43.109803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tjk\" (UniqueName: \"kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk\") pod \"network-check-target-2mm6m\" (UID: \"7c2a9506-e348-4ac6-930a-2264e8f5db1f\") " pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:43.109999 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:43.109928 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:13:43.109999 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:43.109947 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:13:43.109999 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:43.109956 2576 projected.go:194] Error preparing data for projected volume kube-api-access-r2tjk for pod openshift-network-diagnostics/network-check-target-2mm6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:43.110124 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:43.110009 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk podName:7c2a9506-e348-4ac6-930a-2264e8f5db1f nodeName:}" failed. No retries permitted until 2026-04-17 21:14:15.109990742 +0000 UTC m=+66.311483597 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-r2tjk" (UniqueName: "kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk") pod "network-check-target-2mm6m" (UID: "7c2a9506-e348-4ac6-930a-2264e8f5db1f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:13:43.379945 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:43.379918 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:43.380087 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:43.379950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:43.380087 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:43.380021 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2mm6m" podUID="7c2a9506-e348-4ac6-930a-2264e8f5db1f" Apr 17 21:13:43.380193 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:43.380132 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-txkcf" podUID="2f888011-8182-4d16-8a26-05b9e1670eb0" Apr 17 21:13:44.379847 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:44.379819 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:44.380333 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:44.379934 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:13:45.132035 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.131965 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-36.ec2.internal" event="NodeReady" Apr 17 21:13:45.132193 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.132101 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 21:13:45.164212 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.164182 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f"] Apr 17 21:13:45.176342 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.176315 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj"] Apr 17 21:13:45.176477 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.176464 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.179307 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.179011 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 21:13:45.179307 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.179142 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 21:13:45.179307 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.179149 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 21:13:45.179307 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.179242 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 21:13:45.192418 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.192388 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v"] Apr 17 21:13:45.192568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.192534 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" Apr 17 21:13:45.194910 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.194888 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-9hqbl\"" Apr 17 21:13:45.195032 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.194930 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 21:13:45.217543 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.217519 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-dd99ff88-7xx7r"] Apr 17 21:13:45.217709 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.217678 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:13:45.219932 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.219911 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 21:13:45.220054 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.219917 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 21:13:45.220054 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.219950 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vzxvc\"" Apr 17 21:13:45.235771 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.235740 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t"] Apr 17 21:13:45.235892 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.235773 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.238114 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.238096 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 21:13:45.238290 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.238274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 21:13:45.238386 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.238371 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 21:13:45.238438 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.238386 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wxtwb\"" Apr 17 21:13:45.243291 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.243273 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 21:13:45.260534 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.260509 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f"] Apr 17 21:13:45.260534 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.260532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj"] Apr 17 21:13:45.260742 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.260549 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v"] Apr 17 21:13:45.260742 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.260559 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t"] Apr 17 21:13:45.260742 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.260566 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-dd99ff88-7xx7r"] Apr 17 21:13:45.260742 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.260671 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4vxtr"] Apr 17 21:13:45.260742 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.260690 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.262889 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.262866 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 21:13:45.262998 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.262910 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 21:13:45.262998 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.262869 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 21:13:45.262998 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.262987 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 21:13:45.281156 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.281132 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4vxtr"] Apr 17 21:13:45.281283 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.281264 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.283299 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.283276 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 21:13:45.283403 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.283342 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-k464c\"" Apr 17 21:13:45.283490 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.283476 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 21:13:45.290995 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.290974 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mnf4b"] Apr 17 21:13:45.301540 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.301520 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:13:45.301692 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.301675 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mnf4b"] Apr 17 21:13:45.303625 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.303600 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 21:13:45.303625 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.303614 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ggwsl\"" Apr 17 21:13:45.303883 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.303865 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 21:13:45.303943 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.303886 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 21:13:45.323467 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlcqw\" (UniqueName: \"kubernetes.io/projected/ab1472ee-a93b-4272-ae39-dcbf6ad20584-kube-api-access-wlcqw\") pod \"managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj\" (UID: \"ab1472ee-a93b-4272-ae39-dcbf6ad20584\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" Apr 17 21:13:45.323577 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-image-registry-private-configuration\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.323577 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323490 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrgw8\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-kube-api-access-nrgw8\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.323577 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg72h\" (UniqueName: \"kubernetes.io/projected/2d8aa775-1d70-4665-ae32-c33ec28862bf-kube-api-access-mg72h\") pod \"klusterlet-addon-workmgr-75cc66ccb4-4sj2f\" (UID: \"2d8aa775-1d70-4665-ae32-c33ec28862bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.323577 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323554 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.323577 ip-10-0-138-36 
kubenswrapper[2576]: I0417 21:13:45.323570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-bound-sa-token\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.323758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2d8aa775-1d70-4665-ae32-c33ec28862bf-klusterlet-config\") pod \"klusterlet-addon-workmgr-75cc66ccb4-4sj2f\" (UID: \"2d8aa775-1d70-4665-ae32-c33ec28862bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.323758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-installation-pull-secrets\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.323758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:13:45.323758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ab1472ee-a93b-4272-ae39-dcbf6ad20584-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj\" (UID: \"ab1472ee-a93b-4272-ae39-dcbf6ad20584\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" Apr 17 21:13:45.323758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:13:45.323758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d8aa775-1d70-4665-ae32-c33ec28862bf-tmp\") pod \"klusterlet-addon-workmgr-75cc66ccb4-4sj2f\" (UID: \"2d8aa775-1d70-4665-ae32-c33ec28862bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.323931 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/5d28e489-a207-4cf7-9115-97fa588ae515-ca-trust-extracted\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.323931 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-trusted-ca\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.323931 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.323863 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-registry-certificates\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.379388 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.379358 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:45.379643 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.379358 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:13:45.382091 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.382031 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgbn6\"" Apr 17 21:13:45.382091 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.382035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 21:13:45.382091 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.382076 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 21:13:45.382569 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.382063 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 21:13:45.424569 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-registry-certificates\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.424767 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8487d59d-e14f-48c8-bfd1-045074db5610-tmp-dir\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.424767 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-image-registry-private-configuration\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.424767 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrgw8\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-kube-api-access-nrgw8\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.424767 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.424767 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-hub\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.424767 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8487d59d-e14f-48c8-bfd1-045074db5610-config-volume\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424875 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sshp\" (UniqueName: 
\"kubernetes.io/projected/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-kube-api-access-7sshp\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d8aa775-1d70-4665-ae32-c33ec28862bf-tmp\") pod \"klusterlet-addon-workmgr-75cc66ccb4-4sj2f\" (UID: \"2d8aa775-1d70-4665-ae32-c33ec28862bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.424979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d28e489-a207-4cf7-9115-97fa588ae515-ca-trust-extracted\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-trusted-ca\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlcqw\" (UniqueName: \"kubernetes.io/projected/ab1472ee-a93b-4272-ae39-dcbf6ad20584-kube-api-access-wlcqw\") pod \"managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj\" (UID: \"ab1472ee-a93b-4272-ae39-dcbf6ad20584\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mg72h\" (UniqueName: \"kubernetes.io/projected/2d8aa775-1d70-4665-ae32-c33ec28862bf-kube-api-access-mg72h\") pod \"klusterlet-addon-workmgr-75cc66ccb4-4sj2f\" (UID: \"2d8aa775-1d70-4665-ae32-c33ec28862bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.425137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-bound-sa-token\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-ca\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2d8aa775-1d70-4665-ae32-c33ec28862bf-klusterlet-config\") pod \"klusterlet-addon-workmgr-75cc66ccb4-4sj2f\" (UID: \"2d8aa775-1d70-4665-ae32-c33ec28862bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/138036f1-c67f-4b4f-b0b9-95346997ca6e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-installation-pull-secrets\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.425266 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.425285 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dd99ff88-7xx7r: secret "image-registry-tls" not found Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d28e489-a207-4cf7-9115-97fa588ae515-ca-trust-extracted\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425319 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ab1472ee-a93b-4272-ae39-dcbf6ad20584-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj\" (UID: \"ab1472ee-a93b-4272-ae39-dcbf6ad20584\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-registry-certificates\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.425339 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls podName:5d28e489-a207-4cf7-9115-97fa588ae515 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:45.925320836 +0000 UTC m=+37.126813692 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls") pod "image-registry-dd99ff88-7xx7r" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515") : secret "image-registry-tls" not found Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:13:45.425758 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.425626 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 21:13:45.426337 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.425831 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert podName:f81f53a6-485f-4654-b5dd-6b5d5f5784c8 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:45.925813528 +0000 UTC m=+37.127306397 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zsm2v" (UID: "f81f53a6-485f-4654-b5dd-6b5d5f5784c8") : secret "networking-console-plugin-cert" not found Apr 17 21:13:45.426337 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs5wm\" (UniqueName: \"kubernetes.io/projected/138036f1-c67f-4b4f-b0b9-95346997ca6e-kube-api-access-gs5wm\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.426337 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h67tw\" (UniqueName: \"kubernetes.io/projected/8487d59d-e14f-48c8-bfd1-045074db5610-kube-api-access-h67tw\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.426337 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.425290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d8aa775-1d70-4665-ae32-c33ec28862bf-tmp\") pod \"klusterlet-addon-workmgr-75cc66ccb4-4sj2f\" (UID: \"2d8aa775-1d70-4665-ae32-c33ec28862bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.426337 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.426014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-trusted-ca\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.429118 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.429094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-installation-pull-secrets\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.429240 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.429219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-image-registry-private-configuration\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.429240 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.429232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2d8aa775-1d70-4665-ae32-c33ec28862bf-klusterlet-config\") pod \"klusterlet-addon-workmgr-75cc66ccb4-4sj2f\" (UID: \"2d8aa775-1d70-4665-ae32-c33ec28862bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.429326 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.429244 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ab1472ee-a93b-4272-ae39-dcbf6ad20584-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj\" (UID: \"ab1472ee-a93b-4272-ae39-dcbf6ad20584\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" Apr 17 21:13:45.436153 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.436126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg72h\" (UniqueName: \"kubernetes.io/projected/2d8aa775-1d70-4665-ae32-c33ec28862bf-kube-api-access-mg72h\") pod \"klusterlet-addon-workmgr-75cc66ccb4-4sj2f\" (UID: \"2d8aa775-1d70-4665-ae32-c33ec28862bf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.436529 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.436505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlcqw\" (UniqueName: \"kubernetes.io/projected/ab1472ee-a93b-4272-ae39-dcbf6ad20584-kube-api-access-wlcqw\") pod \"managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj\" (UID: \"ab1472ee-a93b-4272-ae39-dcbf6ad20584\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" Apr 17 21:13:45.436766 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.436750 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrgw8\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-kube-api-access-nrgw8\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.437033 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.437012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-bound-sa-token\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.492071 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.492039 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:45.510094 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.510061 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" Apr 17 21:13:45.522864 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.522834 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8698539-048e-4326-92f2-2a5997c36c34" containerID="b9126cbef31972c6b82ad0251f538d0269ebc99981599ec95200ebd90aae0e8f" exitCode=0 Apr 17 21:13:45.523008 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.522897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" event={"ID":"c8698539-048e-4326-92f2-2a5997c36c34","Type":"ContainerDied","Data":"b9126cbef31972c6b82ad0251f538d0269ebc99981599ec95200ebd90aae0e8f"} Apr 17 21:13:45.526433 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.526407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.526545 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.526461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-ca\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.526545 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.526488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/138036f1-c67f-4b4f-b0b9-95346997ca6e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.526545 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.526527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs5wm\" (UniqueName: \"kubernetes.io/projected/138036f1-c67f-4b4f-b0b9-95346997ca6e-kube-api-access-gs5wm\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.526737 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.526557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h67tw\" (UniqueName: \"kubernetes.io/projected/8487d59d-e14f-48c8-bfd1-045074db5610-kube-api-access-h67tw\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.526861 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.526834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8487d59d-e14f-48c8-bfd1-045074db5610-tmp-dir\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.526925 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.526884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.526981 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.526921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-hub\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.526981 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.526964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.527083 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.526992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8487d59d-e14f-48c8-bfd1-045074db5610-config-volume\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.527083 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.527019 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:13:45.527177 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.527086 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls podName:8487d59d-e14f-48c8-bfd1-045074db5610 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:46.027066326 +0000 UTC m=+37.228559179 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls") pod "dns-default-4vxtr" (UID: "8487d59d-e14f-48c8-bfd1-045074db5610") : secret "dns-default-metrics-tls" not found Apr 17 21:13:45.527177 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.527088 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:13:45.527177 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.527121 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert podName:99e3f7d8-5975-4eb1-af99-b5c1cd2c0333 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:46.027111291 +0000 UTC m=+37.228604143 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert") pod "ingress-canary-mnf4b" (UID: "99e3f7d8-5975-4eb1-af99-b5c1cd2c0333") : secret "canary-serving-cert" not found Apr 17 21:13:45.527177 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.527023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:13:45.527177 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.527145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8487d59d-e14f-48c8-bfd1-045074db5610-tmp-dir\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.527177 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.527157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sshp\" (UniqueName: \"kubernetes.io/projected/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-kube-api-access-7sshp\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:13:45.527484 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.527461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/138036f1-c67f-4b4f-b0b9-95346997ca6e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.527735 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.527706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8487d59d-e14f-48c8-bfd1-045074db5610-config-volume\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.529946 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.529896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.530312 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.530266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-hub\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.531273 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.530833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-ca\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.531273 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.531235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/138036f1-c67f-4b4f-b0b9-95346997ca6e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.534349 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.534150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h67tw\" (UniqueName: \"kubernetes.io/projected/8487d59d-e14f-48c8-bfd1-045074db5610-kube-api-access-h67tw\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:45.534790 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.534771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sshp\" (UniqueName: \"kubernetes.io/projected/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-kube-api-access-7sshp\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:13:45.534997 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.534977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs5wm\" (UniqueName: \"kubernetes.io/projected/138036f1-c67f-4b4f-b0b9-95346997ca6e-kube-api-access-gs5wm\") pod \"cluster-proxy-proxy-agent-57776bc698-chk6t\" (UID: \"138036f1-c67f-4b4f-b0b9-95346997ca6e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.569880 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.569855 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 21:13:45.679287 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.679008 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj"] Apr 17 21:13:45.680203 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.680176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f"] Apr 17 21:13:45.681678 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:45.681628 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab1472ee_a93b_4272_ae39_dcbf6ad20584.slice/crio-fca757c2bbe7c4d9d24e4e928a68bb5f2058e4a62dd0302e0a4acbb4d3a77177 WatchSource:0}: Error finding container fca757c2bbe7c4d9d24e4e928a68bb5f2058e4a62dd0302e0a4acbb4d3a77177: Status 404 returned error can't find the container with id fca757c2bbe7c4d9d24e4e928a68bb5f2058e4a62dd0302e0a4acbb4d3a77177 Apr 17 21:13:45.682557 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:45.682529 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8aa775_1d70_4665_ae32_c33ec28862bf.slice/crio-c302635d2ed3b8c95b4f56892a5d7bb4d083fdddcb0eb3802476b037b3580f1e WatchSource:0}: Error finding container c302635d2ed3b8c95b4f56892a5d7bb4d083fdddcb0eb3802476b037b3580f1e: Status 404 returned error can't find the container with id c302635d2ed3b8c95b4f56892a5d7bb4d083fdddcb0eb3802476b037b3580f1e Apr 17 21:13:45.745957 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.745930 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t"] Apr 17 21:13:45.750098 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:45.750074 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod138036f1_c67f_4b4f_b0b9_95346997ca6e.slice/crio-31c128061fe4bfc1dbd2cfb823ae73a7051f97fe754b4365eeed3bba07ba3eea WatchSource:0}: Error finding container 31c128061fe4bfc1dbd2cfb823ae73a7051f97fe754b4365eeed3bba07ba3eea: Status 404 returned error can't find the container with id 31c128061fe4bfc1dbd2cfb823ae73a7051f97fe754b4365eeed3bba07ba3eea Apr 17 21:13:45.931500 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.931407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:45.931500 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:45.931462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:13:45.931725 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.931557 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" 
not found Apr 17 21:13:45.931725 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.931578 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dd99ff88-7xx7r: secret "image-registry-tls" not found Apr 17 21:13:45.931725 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.931614 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 21:13:45.931725 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.931643 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls podName:5d28e489-a207-4cf7-9115-97fa588ae515 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:46.931629583 +0000 UTC m=+38.133122435 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls") pod "image-registry-dd99ff88-7xx7r" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515") : secret "image-registry-tls" not found Apr 17 21:13:45.931725 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:45.931689 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert podName:f81f53a6-485f-4654-b5dd-6b5d5f5784c8 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:46.931678207 +0000 UTC m=+38.133171062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zsm2v" (UID: "f81f53a6-485f-4654-b5dd-6b5d5f5784c8") : secret "networking-console-plugin-cert" not found Apr 17 21:13:46.032746 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.032712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:46.032914 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.032764 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:13:46.032914 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:46.032853 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:13:46.032914 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:46.032856 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:13:46.032914 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:46.032901 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert podName:99e3f7d8-5975-4eb1-af99-b5c1cd2c0333 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:47.032888617 +0000 UTC m=+38.234381468 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert") pod "ingress-canary-mnf4b" (UID: "99e3f7d8-5975-4eb1-af99-b5c1cd2c0333") : secret "canary-serving-cert" not found Apr 17 21:13:46.032914 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:46.032914 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls podName:8487d59d-e14f-48c8-bfd1-045074db5610 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:47.032908148 +0000 UTC m=+38.234401000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls") pod "dns-default-4vxtr" (UID: "8487d59d-e14f-48c8-bfd1-045074db5610") : secret "dns-default-metrics-tls" not found Apr 17 21:13:46.379736 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.379699 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:13:46.382002 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.381976 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 21:13:46.382105 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.382071 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zzp4n\"" Apr 17 21:13:46.526049 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.526014 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" event={"ID":"2d8aa775-1d70-4665-ae32-c33ec28862bf","Type":"ContainerStarted","Data":"c302635d2ed3b8c95b4f56892a5d7bb4d083fdddcb0eb3802476b037b3580f1e"} Apr 17 21:13:46.528599 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.528565 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8698539-048e-4326-92f2-2a5997c36c34" containerID="207e71b791fd9dfd9184f0c821baeceeca14853c6b6a752a193039a99e2d0c4f" exitCode=0 Apr 17 21:13:46.528742 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.528637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" event={"ID":"c8698539-048e-4326-92f2-2a5997c36c34","Type":"ContainerDied","Data":"207e71b791fd9dfd9184f0c821baeceeca14853c6b6a752a193039a99e2d0c4f"} Apr 17 21:13:46.529710 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.529685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" event={"ID":"ab1472ee-a93b-4272-ae39-dcbf6ad20584","Type":"ContainerStarted","Data":"fca757c2bbe7c4d9d24e4e928a68bb5f2058e4a62dd0302e0a4acbb4d3a77177"} Apr 17 21:13:46.530889 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.530844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" event={"ID":"138036f1-c67f-4b4f-b0b9-95346997ca6e","Type":"ContainerStarted","Data":"31c128061fe4bfc1dbd2cfb823ae73a7051f97fe754b4365eeed3bba07ba3eea"} Apr 17 21:13:46.940248 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.940158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod 
\"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:46.940248 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:46.940221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:13:46.940521 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:46.940430 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 21:13:46.940521 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:46.940506 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert podName:f81f53a6-485f-4654-b5dd-6b5d5f5784c8 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:48.940478942 +0000 UTC m=+40.141971795 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zsm2v" (UID: "f81f53a6-485f-4654-b5dd-6b5d5f5784c8") : secret "networking-console-plugin-cert" not found Apr 17 21:13:46.940877 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:46.940760 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:13:46.940877 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:46.940780 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dd99ff88-7xx7r: secret "image-registry-tls" not found Apr 17 21:13:46.940877 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:46.940827 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls podName:5d28e489-a207-4cf7-9115-97fa588ae515 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:48.940811408 +0000 UTC m=+40.142304262 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls") pod "image-registry-dd99ff88-7xx7r" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515") : secret "image-registry-tls" not found Apr 17 21:13:47.041402 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:47.041362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:47.041567 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:47.041443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:13:47.041993 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:47.041792 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:13:47.041993 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:47.041862 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls podName:8487d59d-e14f-48c8-bfd1-045074db5610 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:49.041841355 +0000 UTC m=+40.243334211 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls") pod "dns-default-4vxtr" (UID: "8487d59d-e14f-48c8-bfd1-045074db5610") : secret "dns-default-metrics-tls" not found Apr 17 21:13:47.041993 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:47.041617 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:13:47.041993 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:47.041941 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert podName:99e3f7d8-5975-4eb1-af99-b5c1cd2c0333 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:49.041928241 +0000 UTC m=+40.243421106 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert") pod "ingress-canary-mnf4b" (UID: "99e3f7d8-5975-4eb1-af99-b5c1cd2c0333") : secret "canary-serving-cert" not found Apr 17 21:13:47.540303 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:47.539404 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" event={"ID":"c8698539-048e-4326-92f2-2a5997c36c34","Type":"ContainerStarted","Data":"66288d0c0ade0dc25ccb8938a9709d6fbd2504a878916705c216b1e5588402d1"} Apr 17 21:13:47.562166 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:47.562105 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fnlrt" podStartSLOduration=6.006022439 podStartE2EDuration="38.562084401s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:13:12.016939987 +0000 UTC m=+3.218432838" lastFinishedPulling="2026-04-17 21:13:44.573001948 +0000 UTC m=+35.774494800" observedRunningTime="2026-04-17 21:13:47.560085455 +0000 UTC m=+38.761578330" watchObservedRunningTime="2026-04-17 21:13:47.562084401 +0000 UTC m=+38.763577255" Apr 17 21:13:48.859184 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:48.859138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:48.864435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:48.864289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f888011-8182-4d16-8a26-05b9e1670eb0-original-pull-secret\") pod \"global-pull-secret-syncer-txkcf\" (UID: \"2f888011-8182-4d16-8a26-05b9e1670eb0\") " pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:48.959804 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:48.959629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:48.959804 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:48.959708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:13:48.959804 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:48.959776 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:13:48.959804 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:48.959800 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dd99ff88-7xx7r: secret "image-registry-tls" not found Apr 17 21:13:48.960171 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:48.959862 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls podName:5d28e489-a207-4cf7-9115-97fa588ae515 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:52.959842477 +0000 UTC m=+44.161335354 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls") pod "image-registry-dd99ff88-7xx7r" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515") : secret "image-registry-tls" not found Apr 17 21:13:48.960171 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:48.959884 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 21:13:48.960171 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:48.959934 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert podName:f81f53a6-485f-4654-b5dd-6b5d5f5784c8 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:52.959919682 +0000 UTC m=+44.161412534 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zsm2v" (UID: "f81f53a6-485f-4654-b5dd-6b5d5f5784c8") : secret "networking-console-plugin-cert" not found Apr 17 21:13:48.989884 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:48.989818 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-txkcf" Apr 17 21:13:49.060663 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:49.060616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:13:49.060822 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:49.060756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:49.060822 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:49.060773 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:13:49.060945 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:49.060852 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert podName:99e3f7d8-5975-4eb1-af99-b5c1cd2c0333 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:53.060831847 +0000 UTC m=+44.262324700 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert") pod "ingress-canary-mnf4b" (UID: "99e3f7d8-5975-4eb1-af99-b5c1cd2c0333") : secret "canary-serving-cert" not found Apr 17 21:13:49.060945 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:49.060889 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:13:49.060945 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:49.060942 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls podName:8487d59d-e14f-48c8-bfd1-045074db5610 nodeName:}" failed. No retries permitted until 2026-04-17 21:13:53.060926296 +0000 UTC m=+44.262419163 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls") pod "dns-default-4vxtr" (UID: "8487d59d-e14f-48c8-bfd1-045074db5610") : secret "dns-default-metrics-tls" not found Apr 17 21:13:52.024577 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.024355 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-txkcf"] Apr 17 21:13:52.035190 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:13:52.035162 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f888011_8182_4d16_8a26_05b9e1670eb0.slice/crio-be0e80b7d3b0b86c4c6690eb84242e7766dab9dd264b31069890507882955bac WatchSource:0}: Error finding container be0e80b7d3b0b86c4c6690eb84242e7766dab9dd264b31069890507882955bac: Status 404 returned error can't find the container with id be0e80b7d3b0b86c4c6690eb84242e7766dab9dd264b31069890507882955bac Apr 17 21:13:52.550799 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.550759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" event={"ID":"138036f1-c67f-4b4f-b0b9-95346997ca6e","Type":"ContainerStarted","Data":"1c76bcfc0e87744f85a5fec7ac88a7095cb68d348314bbe4ccb53291129dfaee"} Apr 17 21:13:52.551988 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.551952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" event={"ID":"2d8aa775-1d70-4665-ae32-c33ec28862bf","Type":"ContainerStarted","Data":"04b2ffba98c94b70249777814e0ca860233371d452aac74e2fa600add6f0694d"} Apr 17 21:13:52.552186 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.552156 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:52.553104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.553082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-txkcf" event={"ID":"2f888011-8182-4d16-8a26-05b9e1670eb0","Type":"ContainerStarted","Data":"be0e80b7d3b0b86c4c6690eb84242e7766dab9dd264b31069890507882955bac"} Apr 17 21:13:52.554012 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.553984 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:13:52.554393 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.554375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" event={"ID":"ab1472ee-a93b-4272-ae39-dcbf6ad20584","Type":"ContainerStarted","Data":"f46a2ffa7205cb3f9e7657c96f9d33a93c71841ab09ef244be12cf02b3754fa1"} Apr 17 21:13:52.566864 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.566829 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" podStartSLOduration=7.325004178 podStartE2EDuration="13.566815217s" podCreationTimestamp="2026-04-17 21:13:39 +0000 UTC" firstStartedPulling="2026-04-17 21:13:45.686093696 +0000 UTC m=+36.887586563" lastFinishedPulling="2026-04-17 21:13:51.927904751 +0000 UTC m=+43.129397602" observedRunningTime="2026-04-17 21:13:52.56642297 +0000 UTC m=+43.767915844" watchObservedRunningTime="2026-04-17 21:13:52.566815217 +0000 UTC m=+43.768308091" Apr 17 21:13:52.579471 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.579431 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" podStartSLOduration=7.356708502 podStartE2EDuration="13.579420858s" podCreationTimestamp="2026-04-17 21:13:39 +0000 UTC" firstStartedPulling="2026-04-17 21:13:45.68626056 +0000 UTC m=+36.887753440" lastFinishedPulling="2026-04-17 21:13:51.908972942 +0000 UTC m=+43.110465796" observedRunningTime="2026-04-17 21:13:52.578644572 +0000 UTC m=+43.780137445" watchObservedRunningTime="2026-04-17 21:13:52.579420858 +0000 UTC m=+43.780913731" Apr 17 21:13:52.991418 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.991380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:13:52.991418 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:52.991426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:13:52.991624 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:52.991527 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:13:52.991624 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:52.991545 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dd99ff88-7xx7r: secret "image-registry-tls" not found Apr 17 21:13:52.991624 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:52.991544 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 21:13:52.991624 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:52.991602 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls podName:5d28e489-a207-4cf7-9115-97fa588ae515 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:14:00.991584409 +0000 UTC m=+52.193077260 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls") pod "image-registry-dd99ff88-7xx7r" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515") : secret "image-registry-tls" not found Apr 17 21:13:52.991624 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:52.991617 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert podName:f81f53a6-485f-4654-b5dd-6b5d5f5784c8 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:00.991611285 +0000 UTC m=+52.193104137 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zsm2v" (UID: "f81f53a6-485f-4654-b5dd-6b5d5f5784c8") : secret "networking-console-plugin-cert" not found Apr 17 21:13:53.092470 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:53.092432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:13:53.092863 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:53.092504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:13:53.092863 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:53.092605 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:13:53.092863 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:53.092604 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:13:53.092863 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:53.092687 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert podName:99e3f7d8-5975-4eb1-af99-b5c1cd2c0333 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:01.092666343 +0000 UTC m=+52.294159209 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert") pod "ingress-canary-mnf4b" (UID: "99e3f7d8-5975-4eb1-af99-b5c1cd2c0333") : secret "canary-serving-cert" not found Apr 17 21:13:53.092863 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:13:53.092714 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls podName:8487d59d-e14f-48c8-bfd1-045074db5610 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:01.092700696 +0000 UTC m=+52.294193552 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls") pod "dns-default-4vxtr" (UID: "8487d59d-e14f-48c8-bfd1-045074db5610") : secret "dns-default-metrics-tls" not found Apr 17 21:13:58.567494 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:58.567450 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-txkcf" event={"ID":"2f888011-8182-4d16-8a26-05b9e1670eb0","Type":"ContainerStarted","Data":"2dff42af8c5c32e70da515023aac30d5fcc0b56e3b8530be065d98aba7c32d33"} Apr 17 21:13:58.581776 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:13:58.581723 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-txkcf" podStartSLOduration=37.021247944 podStartE2EDuration="42.581707886s" podCreationTimestamp="2026-04-17 21:13:16 +0000 UTC" firstStartedPulling="2026-04-17 21:13:52.037205255 +0000 UTC m=+43.238698107" lastFinishedPulling="2026-04-17 21:13:57.597665196 +0000 UTC m=+48.799158049" observedRunningTime="2026-04-17 21:13:58.580740621 +0000 UTC m=+49.782233494" watchObservedRunningTime="2026-04-17 21:13:58.581707886 +0000 UTC m=+49.783200760" Apr 17 21:14:01.048137 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:01.048093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:14:01.048705 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:01.048231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:14:01.048705 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:01.048253 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 21:14:01.048705 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:01.048312 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:14:01.048705 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:01.048324 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dd99ff88-7xx7r: secret "image-registry-tls" not found Apr 17 21:14:01.048705 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:01.048324 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert podName:f81f53a6-485f-4654-b5dd-6b5d5f5784c8 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:17.048304492 +0000 UTC m=+68.249797365 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zsm2v" (UID: "f81f53a6-485f-4654-b5dd-6b5d5f5784c8") : secret "networking-console-plugin-cert" not found Apr 17 21:14:01.048705 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:01.048364 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls podName:5d28e489-a207-4cf7-9115-97fa588ae515 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:17.048352867 +0000 UTC m=+68.249845731 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls") pod "image-registry-dd99ff88-7xx7r" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515") : secret "image-registry-tls" not found Apr 17 21:14:01.149018 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:01.148971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:14:01.149201 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:01.149131 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:14:01.149201 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:01.149148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:14:01.149295 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:01.149206 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert podName:99e3f7d8-5975-4eb1-af99-b5c1cd2c0333 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:17.149185165 +0000 UTC m=+68.350678028 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert") pod "ingress-canary-mnf4b" (UID: "99e3f7d8-5975-4eb1-af99-b5c1cd2c0333") : secret "canary-serving-cert" not found Apr 17 21:14:01.149295 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:01.149236 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:14:01.149295 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:01.149288 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls podName:8487d59d-e14f-48c8-bfd1-045074db5610 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:17.149273247 +0000 UTC m=+68.350766102 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls") pod "dns-default-4vxtr" (UID: "8487d59d-e14f-48c8-bfd1-045074db5610") : secret "dns-default-metrics-tls" not found Apr 17 21:14:03.582256 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:03.582214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" event={"ID":"138036f1-c67f-4b4f-b0b9-95346997ca6e","Type":"ContainerStarted","Data":"8ae1218bb13248b8462759e6c0968ac1a4f8d6d8ac2f3e26b594280dee72a850"} Apr 17 21:14:03.582256 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:03.582259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" event={"ID":"138036f1-c67f-4b4f-b0b9-95346997ca6e","Type":"ContainerStarted","Data":"ca44a7f60d0668db7b40eb769f3a9f763ea3af2049d7cf48e556689e9b061a3d"} Apr 17 21:14:03.601738 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:03.600835 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" podStartSLOduration=7.483582152 podStartE2EDuration="24.600816688s" podCreationTimestamp="2026-04-17 21:13:39 +0000 UTC" firstStartedPulling="2026-04-17 21:13:45.75193211 +0000 UTC m=+36.953424966" lastFinishedPulling="2026-04-17 21:14:02.869166636 +0000 UTC m=+54.070659502" observedRunningTime="2026-04-17 21:14:03.600066372 +0000 UTC m=+54.801559246" watchObservedRunningTime="2026-04-17 21:14:03.600816688 +0000 UTC m=+54.802309563" Apr 17 21:14:10.521860 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:10.521830 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wvwmf" Apr 17 21:14:15.059586 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:15.059535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:14:15.061731 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:15.061711 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 21:14:15.070399 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:15.070382 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 21:14:15.070443 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:15.070434 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs podName:8a39439a-b5a0-4399-975e-838c219449b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:19.070419341 +0000 UTC m=+130.271912193 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs") pod "network-metrics-daemon-rcbth" (UID: "8a39439a-b5a0-4399-975e-838c219449b7") : secret "metrics-daemon-secret" not found Apr 17 21:14:15.160302 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:15.160264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tjk\" (UniqueName: \"kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk\") pod \"network-check-target-2mm6m\" (UID: \"7c2a9506-e348-4ac6-930a-2264e8f5db1f\") " pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:14:15.162547 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:15.162530 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 21:14:15.173285 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:15.173266 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 21:14:15.185455 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:15.185426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2tjk\" (UniqueName: \"kubernetes.io/projected/7c2a9506-e348-4ac6-930a-2264e8f5db1f-kube-api-access-r2tjk\") pod \"network-check-target-2mm6m\" (UID: \"7c2a9506-e348-4ac6-930a-2264e8f5db1f\") " pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:14:15.396083 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:15.396054 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgbn6\"" Apr 17 21:14:15.403861 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:15.403834 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:14:15.512792 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:15.512763 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2mm6m"] Apr 17 21:14:15.515973 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:14:15.515944 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c2a9506_e348_4ac6_930a_2264e8f5db1f.slice/crio-b33d02e52df2ad3aeb6ca3afcec04e7d1b5909ae067f3836416342fc1c190c88 WatchSource:0}: Error finding container b33d02e52df2ad3aeb6ca3afcec04e7d1b5909ae067f3836416342fc1c190c88: Status 404 returned error can't find the container with id b33d02e52df2ad3aeb6ca3afcec04e7d1b5909ae067f3836416342fc1c190c88 Apr 17 21:14:15.613536 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:15.613497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2mm6m" event={"ID":"7c2a9506-e348-4ac6-930a-2264e8f5db1f","Type":"ContainerStarted","Data":"b33d02e52df2ad3aeb6ca3afcec04e7d1b5909ae067f3836416342fc1c190c88"} Apr 17 21:14:17.075443 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:17.075388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:14:17.075443 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:17.075448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:14:17.075993 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:17.075532 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:14:17.075993 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:17.075554 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dd99ff88-7xx7r: secret "image-registry-tls" not found Apr 17 21:14:17.075993 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:17.075558 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 21:14:17.075993 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:17.075606 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls podName:5d28e489-a207-4cf7-9115-97fa588ae515 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:49.075590723 +0000 UTC m=+100.277083580 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls") pod "image-registry-dd99ff88-7xx7r" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515") : secret "image-registry-tls" not found Apr 17 21:14:17.075993 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:17.075622 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert podName:f81f53a6-485f-4654-b5dd-6b5d5f5784c8 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:49.075615768 +0000 UTC m=+100.277108620 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zsm2v" (UID: "f81f53a6-485f-4654-b5dd-6b5d5f5784c8") : secret "networking-console-plugin-cert" not found Apr 17 21:14:17.176643 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:17.176608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:14:17.176818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:17.176753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:14:17.176818 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:17.176762 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:14:17.176909 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:17.176847 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls podName:8487d59d-e14f-48c8-bfd1-045074db5610 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:49.176830272 +0000 UTC m=+100.378323124 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls") pod "dns-default-4vxtr" (UID: "8487d59d-e14f-48c8-bfd1-045074db5610") : secret "dns-default-metrics-tls" not found Apr 17 21:14:17.176909 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:17.176885 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:14:17.176979 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:17.176936 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert podName:99e3f7d8-5975-4eb1-af99-b5c1cd2c0333 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:49.176921487 +0000 UTC m=+100.378414344 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert") pod "ingress-canary-mnf4b" (UID: "99e3f7d8-5975-4eb1-af99-b5c1cd2c0333") : secret "canary-serving-cert" not found Apr 17 21:14:19.628181 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:19.628146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2mm6m" event={"ID":"7c2a9506-e348-4ac6-930a-2264e8f5db1f","Type":"ContainerStarted","Data":"48ad42e3ca01278eabf30a584a30e23b95d7e72629286aceb7d63a48e153e05e"} Apr 17 21:14:19.628693 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:19.628271 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:14:19.642551 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:19.642500 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2mm6m" podStartSLOduration=67.056103563 podStartE2EDuration="1m10.642487316s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:14:15.517845342 +0000 UTC m=+66.719338195" lastFinishedPulling="2026-04-17 21:14:19.104229094 +0000 UTC m=+70.305721948" observedRunningTime="2026-04-17 21:14:19.641947994 +0000 UTC m=+70.843440894" watchObservedRunningTime="2026-04-17 21:14:19.642487316 +0000 UTC m=+70.843980189" Apr 17 21:14:49.131227 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:49.131187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:14:49.131630 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:49.131239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:14:49.131630 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:49.131347 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:14:49.131630 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:49.131369 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dd99ff88-7xx7r: secret "image-registry-tls" not found Apr 17 21:14:49.131630 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:49.131428 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls podName:5d28e489-a207-4cf7-9115-97fa588ae515 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:53.131413861 +0000 UTC m=+164.332906713 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls") pod "image-registry-dd99ff88-7xx7r" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515") : secret "image-registry-tls" not found Apr 17 21:14:49.131630 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:49.131356 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 21:14:49.131630 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:49.131483 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert podName:f81f53a6-485f-4654-b5dd-6b5d5f5784c8 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:53.131471857 +0000 UTC m=+164.332964708 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zsm2v" (UID: "f81f53a6-485f-4654-b5dd-6b5d5f5784c8") : secret "networking-console-plugin-cert" not found Apr 17 21:14:49.231990 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:49.231957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:14:49.232086 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:49.232055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:14:49.232136 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:49.232093 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:14:49.232171 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:49.232146 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:14:49.232208 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:49.232147 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert podName:99e3f7d8-5975-4eb1-af99-b5c1cd2c0333 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:53.232133945 +0000 UTC m=+164.433626796 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert") pod "ingress-canary-mnf4b" (UID: "99e3f7d8-5975-4eb1-af99-b5c1cd2c0333") : secret "canary-serving-cert" not found Apr 17 21:14:49.232208 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:14:49.232194 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls podName:8487d59d-e14f-48c8-bfd1-045074db5610 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:53.232181351 +0000 UTC m=+164.433674203 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls") pod "dns-default-4vxtr" (UID: "8487d59d-e14f-48c8-bfd1-045074db5610") : secret "dns-default-metrics-tls" not found Apr 17 21:14:50.633302 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:14:50.633269 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2mm6m" Apr 17 21:15:19.160850 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:19.160782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:15:19.161438 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:15:19.160955 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 21:15:19.161438 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:15:19.161054 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs podName:8a39439a-b5a0-4399-975e-838c219449b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:17:21.161032082 +0000 UTC m=+252.362524934 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs") pod "network-metrics-daemon-rcbth" (UID: "8a39439a-b5a0-4399-975e-838c219449b7") : secret "metrics-daemon-secret" not found Apr 17 21:15:35.804558 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:35.804525 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dj96t_73df3c3b-340e-459f-a30d-51085c37c69b/dns-node-resolver/0.log" Apr 17 21:15:36.603730 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:36.603704 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8bvhr_becd6e32-bdde-40bf-bef7-c2f14ff29b2b/node-ca/0.log" Apr 17 21:15:48.226063 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:15:48.226001 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" podUID="f81f53a6-485f-4654-b5dd-6b5d5f5784c8" Apr 17 21:15:48.245259 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:15:48.245219 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" podUID="5d28e489-a207-4cf7-9115-97fa588ae515" Apr 17 21:15:48.299075 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:15:48.299034 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4vxtr" podUID="8487d59d-e14f-48c8-bfd1-045074db5610" Apr 17 21:15:48.313206 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:15:48.313172 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context 
deadline exceeded" pod="openshift-ingress-canary/ingress-canary-mnf4b" podUID="99e3f7d8-5975-4eb1-af99-b5c1cd2c0333" Apr 17 21:15:48.834553 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:48.834508 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:15:48.834553 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:48.834530 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:15:48.834812 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:48.834562 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:15:49.388547 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:15:49.388499 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rcbth" podUID="8a39439a-b5a0-4399-975e-838c219449b7" Apr 17 21:15:52.552636 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:52.552512 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" podUID="2d8aa775-1d70-4665-ae32-c33ec28862bf" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.6:8000/readyz\": dial tcp 10.132.0.6:8000: connect: connection refused" Apr 17 21:15:52.844880 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:52.844848 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d8aa775-1d70-4665-ae32-c33ec28862bf" containerID="04b2ffba98c94b70249777814e0ca860233371d452aac74e2fa600add6f0694d" exitCode=1 Apr 17 21:15:52.845055 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:52.844918 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" event={"ID":"2d8aa775-1d70-4665-ae32-c33ec28862bf","Type":"ContainerDied","Data":"04b2ffba98c94b70249777814e0ca860233371d452aac74e2fa600add6f0694d"} Apr 17 21:15:52.845304 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:52.845287 2576 scope.go:117] "RemoveContainer" containerID="04b2ffba98c94b70249777814e0ca860233371d452aac74e2fa600add6f0694d" Apr 17 21:15:52.846257 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:52.846235 2576 generic.go:358] "Generic (PLEG): container finished" podID="ab1472ee-a93b-4272-ae39-dcbf6ad20584" containerID="f46a2ffa7205cb3f9e7657c96f9d33a93c71841ab09ef244be12cf02b3754fa1" exitCode=255 Apr 17 21:15:52.846356 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:52.846279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" event={"ID":"ab1472ee-a93b-4272-ae39-dcbf6ad20584","Type":"ContainerDied","Data":"f46a2ffa7205cb3f9e7657c96f9d33a93c71841ab09ef244be12cf02b3754fa1"} Apr 17 21:15:52.846690 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:52.846675 2576 scope.go:117] "RemoveContainer" containerID="f46a2ffa7205cb3f9e7657c96f9d33a93c71841ab09ef244be12cf02b3754fa1" Apr 17 21:15:53.227181 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.227077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"image-registry-dd99ff88-7xx7r\" (UID: 
\"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:15:53.227181 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.227133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:15:53.227384 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:15:53.227231 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 21:15:53.227384 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:15:53.227295 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert podName:f81f53a6-485f-4654-b5dd-6b5d5f5784c8 nodeName:}" failed. No retries permitted until 2026-04-17 21:17:55.227280044 +0000 UTC m=+286.428772896 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zsm2v" (UID: "f81f53a6-485f-4654-b5dd-6b5d5f5784c8") : secret "networking-console-plugin-cert" not found Apr 17 21:15:53.229472 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.229447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"image-registry-dd99ff88-7xx7r\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:15:53.327787 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.327744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:15:53.327787 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.327793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:15:53.330210 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.330182 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8487d59d-e14f-48c8-bfd1-045074db5610-metrics-tls\") pod \"dns-default-4vxtr\" (UID: \"8487d59d-e14f-48c8-bfd1-045074db5610\") " pod="openshift-dns/dns-default-4vxtr" Apr 17 21:15:53.330327 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.330218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99e3f7d8-5975-4eb1-af99-b5c1cd2c0333-cert\") pod \"ingress-canary-mnf4b\" (UID: \"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333\") " pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:15:53.337291 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.337263 
2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ggwsl\"" Apr 17 21:15:53.337291 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.337263 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wxtwb\"" Apr 17 21:15:53.346093 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.346050 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:15:53.346255 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.346127 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mnf4b" Apr 17 21:15:53.481076 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.480980 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mnf4b"] Apr 17 21:15:53.485127 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:15:53.485095 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99e3f7d8_5975_4eb1_af99_b5c1cd2c0333.slice/crio-031c6072d625a6a962b33a0f33d34a5e5c9f7245dd09e411d9712d46a9ed0f1c WatchSource:0}: Error finding container 031c6072d625a6a962b33a0f33d34a5e5c9f7245dd09e411d9712d46a9ed0f1c: Status 404 returned error can't find the container with id 031c6072d625a6a962b33a0f33d34a5e5c9f7245dd09e411d9712d46a9ed0f1c Apr 17 21:15:53.497823 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.497795 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-dd99ff88-7xx7r"] Apr 17 21:15:53.501075 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:15:53.501006 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d28e489_a207_4cf7_9115_97fa588ae515.slice/crio-c3fd6aac46623fe24f76028df83f8a9173d9cd40bd1364ef0e379350172ad2cc WatchSource:0}: Error finding container c3fd6aac46623fe24f76028df83f8a9173d9cd40bd1364ef0e379350172ad2cc: Status 404 returned error can't find the container with id c3fd6aac46623fe24f76028df83f8a9173d9cd40bd1364ef0e379350172ad2cc Apr 17 21:15:53.849489 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.849454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" event={"ID":"5d28e489-a207-4cf7-9115-97fa588ae515","Type":"ContainerStarted","Data":"14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f"} Apr 17 21:15:53.849952 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.849496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" event={"ID":"5d28e489-a207-4cf7-9115-97fa588ae515","Type":"ContainerStarted","Data":"c3fd6aac46623fe24f76028df83f8a9173d9cd40bd1364ef0e379350172ad2cc"} Apr 17 21:15:53.849952 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.849533 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:15:53.851152 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.851125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" 
event={"ID":"2d8aa775-1d70-4665-ae32-c33ec28862bf","Type":"ContainerStarted","Data":"df33897de110c47ff3c1dd6cfec259ae80e9b3656d65fb740c653da6a0b3f174"} Apr 17 21:15:53.851429 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.851405 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:15:53.852072 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.852054 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75cc66ccb4-4sj2f" Apr 17 21:15:53.852915 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.852894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5bdfb7649c-sdfbj" event={"ID":"ab1472ee-a93b-4272-ae39-dcbf6ad20584","Type":"ContainerStarted","Data":"92d0a34c28a59e79a279c6148e2a6169241a46fc7bd4e2b6b111be99920800f7"} Apr 17 21:15:53.853925 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.853902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mnf4b" event={"ID":"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333","Type":"ContainerStarted","Data":"031c6072d625a6a962b33a0f33d34a5e5c9f7245dd09e411d9712d46a9ed0f1c"} Apr 17 21:15:53.868179 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:53.868133 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" podStartSLOduration=164.868117792 podStartE2EDuration="2m44.868117792s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:15:53.866885619 +0000 UTC m=+165.068378507" watchObservedRunningTime="2026-04-17 21:15:53.868117792 +0000 UTC m=+165.069610666" Apr 17 21:15:55.860800 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:55.860750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mnf4b" event={"ID":"99e3f7d8-5975-4eb1-af99-b5c1cd2c0333","Type":"ContainerStarted","Data":"94984d2cebfa371f81a698bee10376eb4c0d321ab2df4858ff37fe8ed0aca0ed"} Apr 17 21:15:55.875586 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:15:55.875460 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mnf4b" podStartSLOduration=128.735869657 podStartE2EDuration="2m10.875446901s" podCreationTimestamp="2026-04-17 21:13:45 +0000 UTC" firstStartedPulling="2026-04-17 21:15:53.48686925 +0000 UTC m=+164.688362102" lastFinishedPulling="2026-04-17 21:15:55.626446482 +0000 UTC m=+166.827939346" observedRunningTime="2026-04-17 21:15:55.874516158 +0000 UTC m=+167.076009057" watchObservedRunningTime="2026-04-17 21:15:55.875446901 +0000 UTC m=+167.076939775" Apr 17 21:16:01.379843 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:01.379756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4vxtr" Apr 17 21:16:01.382337 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:01.382319 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-k464c\"" Apr 17 21:16:01.390211 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:01.390186 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4vxtr" Apr 17 21:16:01.505629 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:01.505597 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4vxtr"] Apr 17 21:16:01.508537 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:16:01.508508 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8487d59d_e14f_48c8_bfd1_045074db5610.slice/crio-dfe55cf8a4d5fdc4a42b17e4aaa6f92ba73708c0c46b8bbbef927c494df84dd4 WatchSource:0}: Error finding container dfe55cf8a4d5fdc4a42b17e4aaa6f92ba73708c0c46b8bbbef927c494df84dd4: Status 404 returned error can't find the container with id dfe55cf8a4d5fdc4a42b17e4aaa6f92ba73708c0c46b8bbbef927c494df84dd4 Apr 17 21:16:01.877453 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:01.877411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4vxtr" event={"ID":"8487d59d-e14f-48c8-bfd1-045074db5610","Type":"ContainerStarted","Data":"dfe55cf8a4d5fdc4a42b17e4aaa6f92ba73708c0c46b8bbbef927c494df84dd4"} Apr 17 21:16:02.881240 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:02.881207 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4vxtr" event={"ID":"8487d59d-e14f-48c8-bfd1-045074db5610","Type":"ContainerStarted","Data":"537d5fa2fb969621a5eb9473e5c73ff9942af3b699b0c9a0f6c30409abc37a46"} Apr 17 21:16:02.881240 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:02.881241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4vxtr" event={"ID":"8487d59d-e14f-48c8-bfd1-045074db5610","Type":"ContainerStarted","Data":"202be138cea115166c93d6803bb157fe38a277a98c3483aeed354f1b1707e80e"} Apr 17 21:16:02.881763 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:02.881322 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4vxtr" Apr 17 21:16:02.896255 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:02.896155 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4vxtr" podStartSLOduration=136.767712919 podStartE2EDuration="2m17.896141679s" podCreationTimestamp="2026-04-17 21:13:45 +0000 UTC" firstStartedPulling="2026-04-17 21:16:01.510381442 +0000 UTC m=+172.711874300" lastFinishedPulling="2026-04-17 21:16:02.638810204 +0000 UTC m=+173.840303060" observedRunningTime="2026-04-17 21:16:02.895150984 +0000 UTC m=+174.096643861" watchObservedRunningTime="2026-04-17 21:16:02.896141679 +0000 UTC m=+174.097634552" Apr 17 21:16:04.379674 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:04.379621 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:16:05.829495 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.829455 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cx5kv"] Apr 17 21:16:05.832639 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.832617 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:05.834867 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.834846 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 21:16:05.835823 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.835805 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 21:16:05.835914 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.835852 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5zvnz\"" Apr 17 21:16:05.835914 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.835866 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 21:16:05.835914 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.835899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 21:16:05.843349 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.843327 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cx5kv"] Apr 17 21:16:05.924766 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.924735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/35f8b229-3626-445e-b9f4-d73400ef233a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:05.924941 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.924775 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/35f8b229-3626-445e-b9f4-d73400ef233a-crio-socket\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:05.924941 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.924795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7bh5\" (UniqueName: \"kubernetes.io/projected/35f8b229-3626-445e-b9f4-d73400ef233a-kube-api-access-f7bh5\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:05.924941 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.924878 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/35f8b229-3626-445e-b9f4-d73400ef233a-data-volume\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:05.924941 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:05.924899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35f8b229-3626-445e-b9f4-d73400ef233a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " 
pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.025964 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.025934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/35f8b229-3626-445e-b9f4-d73400ef233a-crio-socket\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.025964 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.025968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7bh5\" (UniqueName: \"kubernetes.io/projected/35f8b229-3626-445e-b9f4-d73400ef233a-kube-api-access-f7bh5\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.026212 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.026069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/35f8b229-3626-445e-b9f4-d73400ef233a-crio-socket\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.026212 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.026146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/35f8b229-3626-445e-b9f4-d73400ef233a-data-volume\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.026212 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.026171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35f8b229-3626-445e-b9f4-d73400ef233a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.026319 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.026219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/35f8b229-3626-445e-b9f4-d73400ef233a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.026561 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.026542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/35f8b229-3626-445e-b9f4-d73400ef233a-data-volume\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.026818 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.026803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/35f8b229-3626-445e-b9f4-d73400ef233a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.028416 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.028400 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35f8b229-3626-445e-b9f4-d73400ef233a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.033160 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.033139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7bh5\" (UniqueName: \"kubernetes.io/projected/35f8b229-3626-445e-b9f4-d73400ef233a-kube-api-access-f7bh5\") pod \"insights-runtime-extractor-cx5kv\" (UID: \"35f8b229-3626-445e-b9f4-d73400ef233a\") " pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.141486 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.141444 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cx5kv" Apr 17 21:16:06.255261 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.255226 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cx5kv"] Apr 17 21:16:06.258237 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:16:06.258206 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35f8b229_3626_445e_b9f4_d73400ef233a.slice/crio-05f19fdaf14feb3950a30a154d3a312672cb3b3e2a3b2929fa235accc8aa684d WatchSource:0}: Error finding container 05f19fdaf14feb3950a30a154d3a312672cb3b3e2a3b2929fa235accc8aa684d: Status 404 returned error can't find the container with id 05f19fdaf14feb3950a30a154d3a312672cb3b3e2a3b2929fa235accc8aa684d Apr 17 21:16:06.893827 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.893791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cx5kv" event={"ID":"35f8b229-3626-445e-b9f4-d73400ef233a","Type":"ContainerStarted","Data":"b7e0586baa47da6d7de85172a3b1ef610bca0b034bf651914a2d6c307358af02"} Apr 17 21:16:06.893827 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:06.893825 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cx5kv" event={"ID":"35f8b229-3626-445e-b9f4-d73400ef233a","Type":"ContainerStarted","Data":"05f19fdaf14feb3950a30a154d3a312672cb3b3e2a3b2929fa235accc8aa684d"} Apr 17 21:16:07.898607 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:07.898572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cx5kv" event={"ID":"35f8b229-3626-445e-b9f4-d73400ef233a","Type":"ContainerStarted","Data":"fd0dd0dfd181f3493aa7d585d4de70edb30048a8240f01f99ad9166fd7f3f437"} Apr 17 21:16:08.908416 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:08.908382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cx5kv" event={"ID":"35f8b229-3626-445e-b9f4-d73400ef233a","Type":"ContainerStarted","Data":"3d1056b88a0f4c5f9325ff3f7f57e5176e13a733d29e4715131c33a3fd2c59fd"} Apr 17 21:16:08.925058 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:08.925014 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cx5kv" podStartSLOduration=1.476610937 podStartE2EDuration="3.924999772s" podCreationTimestamp="2026-04-17 21:16:05 +0000 UTC" firstStartedPulling="2026-04-17 21:16:06.316444772 +0000 UTC m=+177.517937625" 
lastFinishedPulling="2026-04-17 21:16:08.764833596 +0000 UTC m=+179.966326460" observedRunningTime="2026-04-17 21:16:08.923792916 +0000 UTC m=+180.125285804" watchObservedRunningTime="2026-04-17 21:16:08.924999772 +0000 UTC m=+180.126492646" Apr 17 21:16:12.885716 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.885677 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4vxtr" Apr 17 21:16:12.914555 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.914519 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4gtnq"] Apr 17 21:16:12.918184 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.918156 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:12.923728 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.923703 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 21:16:12.923870 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.923762 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 21:16:12.923870 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.923819 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 21:16:12.923870 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.923747 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-m8ph4\"" Apr 17 21:16:12.924030 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.923703 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 21:16:12.924203 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.924180 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 21:16:12.924339 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.923743 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 21:16:12.978519 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.978486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c494d1b2-41ce-4d6d-9180-64145606ffcb-metrics-client-ca\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:12.978843 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.978819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c494d1b2-41ce-4d6d-9180-64145606ffcb-root\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:12.978972 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.978858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c494d1b2-41ce-4d6d-9180-64145606ffcb-sys\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " 
pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:12.978972 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.978879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-tls\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:12.978972 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.978904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-accelerators-collector-config\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:12.979108 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.979003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-textfile\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:12.979210 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.979184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-wtmp\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:12.979485 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.979218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:12.979485 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:12.979271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gldvp\" (UniqueName: \"kubernetes.io/projected/c494d1b2-41ce-4d6d-9180-64145606ffcb-kube-api-access-gldvp\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080328 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gldvp\" (UniqueName: \"kubernetes.io/projected/c494d1b2-41ce-4d6d-9180-64145606ffcb-kube-api-access-gldvp\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080328 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c494d1b2-41ce-4d6d-9180-64145606ffcb-metrics-client-ca\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080556 
ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c494d1b2-41ce-4d6d-9180-64145606ffcb-root\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080556 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c494d1b2-41ce-4d6d-9180-64145606ffcb-sys\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080556 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-tls\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080556 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-accelerators-collector-config\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080556 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-textfile\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080556 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c494d1b2-41ce-4d6d-9180-64145606ffcb-root\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080556 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c494d1b2-41ce-4d6d-9180-64145606ffcb-sys\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080556 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-wtmp\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080973 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") 
" pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080973 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-wtmp\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.080973 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.080967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-textfile\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.081088 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.081051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c494d1b2-41ce-4d6d-9180-64145606ffcb-metrics-client-ca\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.081140 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.081121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-accelerators-collector-config\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.082937 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.082916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.083086 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.083072 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c494d1b2-41ce-4d6d-9180-64145606ffcb-node-exporter-tls\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.088049 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.088026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gldvp\" (UniqueName: \"kubernetes.io/projected/c494d1b2-41ce-4d6d-9180-64145606ffcb-kube-api-access-gldvp\") pod \"node-exporter-4gtnq\" (UID: \"c494d1b2-41ce-4d6d-9180-64145606ffcb\") " pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.232798 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.232709 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4gtnq" Apr 17 21:16:13.242023 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:16:13.241991 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc494d1b2_41ce_4d6d_9180_64145606ffcb.slice/crio-ca9fbfd7939ecb31e78f088f5a22273addc435e4a6e47c636c33e5535482894d WatchSource:0}: Error finding container ca9fbfd7939ecb31e78f088f5a22273addc435e4a6e47c636c33e5535482894d: Status 404 returned error can't find the container with id ca9fbfd7939ecb31e78f088f5a22273addc435e4a6e47c636c33e5535482894d Apr 17 21:16:13.349984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.349926 2576 patch_prober.go:28] interesting pod/image-registry-dd99ff88-7xx7r container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 21:16:13.350132 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.350012 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" podUID="5d28e489-a207-4cf7-9115-97fa588ae515" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 21:16:13.921760 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:13.921729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4gtnq" event={"ID":"c494d1b2-41ce-4d6d-9180-64145606ffcb","Type":"ContainerStarted","Data":"ca9fbfd7939ecb31e78f088f5a22273addc435e4a6e47c636c33e5535482894d"} Apr 17 21:16:14.860338 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:14.860299 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:16:14.926148 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:14.926110 2576 generic.go:358] "Generic (PLEG): container finished" podID="c494d1b2-41ce-4d6d-9180-64145606ffcb" containerID="06bd0845f72f8f69d5d7a038aa945da43c8ec3d3fc552daa9d3b06f1552daed2" exitCode=0 Apr 17 21:16:14.926510 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:14.926154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4gtnq" event={"ID":"c494d1b2-41ce-4d6d-9180-64145606ffcb","Type":"ContainerDied","Data":"06bd0845f72f8f69d5d7a038aa945da43c8ec3d3fc552daa9d3b06f1552daed2"} Apr 17 21:16:15.930167 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:15.930130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4gtnq" event={"ID":"c494d1b2-41ce-4d6d-9180-64145606ffcb","Type":"ContainerStarted","Data":"c70b95f7ad312291faef4cef0a642303958b475ca32d850234402445b0246704"} Apr 17 21:16:15.930167 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:15.930172 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4gtnq" event={"ID":"c494d1b2-41ce-4d6d-9180-64145606ffcb","Type":"ContainerStarted","Data":"194873ae06a03cd13c6f2ee6fe4ec9d03d7c874d36b3f8408b09f5e38b2df1bf"} Apr 17 21:16:15.948991 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:15.948943 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4gtnq" podStartSLOduration=3.070147453 podStartE2EDuration="3.948928919s" podCreationTimestamp="2026-04-17 21:16:12 +0000 UTC" 
firstStartedPulling="2026-04-17 21:16:13.243723593 +0000 UTC m=+184.445216452" lastFinishedPulling="2026-04-17 21:16:14.122505055 +0000 UTC m=+185.323997918" observedRunningTime="2026-04-17 21:16:15.94806264 +0000 UTC m=+187.149555514" watchObservedRunningTime="2026-04-17 21:16:15.948928919 +0000 UTC m=+187.150421792" Apr 17 21:16:27.473772 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:27.473723 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-dd99ff88-7xx7r"] Apr 17 21:16:35.571321 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:35.571275 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" podUID="138036f1-c67f-4b4f-b0b9-95346997ca6e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 21:16:39.014463 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:39.014429 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mnf4b_99e3f7d8-5975-4eb1-af99-b5c1cd2c0333/serve-healthcheck-canary/0.log" Apr 17 21:16:45.570964 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:45.570924 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" podUID="138036f1-c67f-4b4f-b0b9-95346997ca6e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 21:16:52.496422 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.496363 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" podUID="5d28e489-a207-4cf7-9115-97fa588ae515" containerName="registry" containerID="cri-o://14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f" gracePeriod=30 Apr 17 21:16:52.726514 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.726491 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:16:52.891790 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.891757 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-trusted-ca\") pod \"5d28e489-a207-4cf7-9115-97fa588ae515\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " Apr 17 21:16:52.891790 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.891805 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-bound-sa-token\") pod \"5d28e489-a207-4cf7-9115-97fa588ae515\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " Apr 17 21:16:52.892048 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.891835 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-registry-certificates\") pod \"5d28e489-a207-4cf7-9115-97fa588ae515\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " Apr 17 21:16:52.892048 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.891877 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") pod \"5d28e489-a207-4cf7-9115-97fa588ae515\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " Apr 17 21:16:52.892048 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.891898 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d28e489-a207-4cf7-9115-97fa588ae515-ca-trust-extracted\") pod \"5d28e489-a207-4cf7-9115-97fa588ae515\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " Apr 17 21:16:52.892048 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.891923 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrgw8\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-kube-api-access-nrgw8\") pod \"5d28e489-a207-4cf7-9115-97fa588ae515\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " Apr 17 21:16:52.892048 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.891952 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-image-registry-private-configuration\") pod \"5d28e489-a207-4cf7-9115-97fa588ae515\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " Apr 17 21:16:52.892048 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.891981 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-installation-pull-secrets\") pod \"5d28e489-a207-4cf7-9115-97fa588ae515\" (UID: \"5d28e489-a207-4cf7-9115-97fa588ae515\") " Apr 17 21:16:52.892401 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.892242 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5d28e489-a207-4cf7-9115-97fa588ae515" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:16:52.892533 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.892500 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5d28e489-a207-4cf7-9115-97fa588ae515" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:16:52.894322 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.894289 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5d28e489-a207-4cf7-9115-97fa588ae515" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:16:52.894435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.894294 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5d28e489-a207-4cf7-9115-97fa588ae515" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:16:52.894500 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.894460 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5d28e489-a207-4cf7-9115-97fa588ae515" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:16:52.894557 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.894525 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5d28e489-a207-4cf7-9115-97fa588ae515" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:16:52.894688 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.894634 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-kube-api-access-nrgw8" (OuterVolumeSpecName: "kube-api-access-nrgw8") pod "5d28e489-a207-4cf7-9115-97fa588ae515" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515"). InnerVolumeSpecName "kube-api-access-nrgw8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:16:52.900895 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.900860 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d28e489-a207-4cf7-9115-97fa588ae515-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5d28e489-a207-4cf7-9115-97fa588ae515" (UID: "5d28e489-a207-4cf7-9115-97fa588ae515"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:16:52.993309 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.993270 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-bound-sa-token\") on node \"ip-10-0-138-36.ec2.internal\" DevicePath \"\"" Apr 17 21:16:52.993309 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.993300 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-registry-certificates\") on node \"ip-10-0-138-36.ec2.internal\" DevicePath \"\"" Apr 17 21:16:52.993309 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.993312 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-registry-tls\") on node \"ip-10-0-138-36.ec2.internal\" DevicePath \"\"" Apr 17 21:16:52.993527 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.993321 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d28e489-a207-4cf7-9115-97fa588ae515-ca-trust-extracted\") on node \"ip-10-0-138-36.ec2.internal\" DevicePath \"\"" Apr 17 21:16:52.993527 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.993330 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nrgw8\" (UniqueName: \"kubernetes.io/projected/5d28e489-a207-4cf7-9115-97fa588ae515-kube-api-access-nrgw8\") on node \"ip-10-0-138-36.ec2.internal\" DevicePath \"\"" Apr 17 21:16:52.993527 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.993339 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-image-registry-private-configuration\") on node \"ip-10-0-138-36.ec2.internal\" DevicePath \"\"" Apr 17 21:16:52.993527 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.993349 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d28e489-a207-4cf7-9115-97fa588ae515-installation-pull-secrets\") on node \"ip-10-0-138-36.ec2.internal\" DevicePath \"\"" Apr 17 21:16:52.993527 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:52.993357 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d28e489-a207-4cf7-9115-97fa588ae515-trusted-ca\") on node \"ip-10-0-138-36.ec2.internal\" DevicePath \"\"" Apr 17 21:16:53.029615 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:53.029583 2576 generic.go:358] "Generic (PLEG): container finished" podID="5d28e489-a207-4cf7-9115-97fa588ae515" containerID="14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f" exitCode=0 Apr 17 21:16:53.029808 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:53.029689 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" Apr 17 21:16:53.029808 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:53.029686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" event={"ID":"5d28e489-a207-4cf7-9115-97fa588ae515","Type":"ContainerDied","Data":"14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f"} Apr 17 21:16:53.029808 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:53.029736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-dd99ff88-7xx7r" event={"ID":"5d28e489-a207-4cf7-9115-97fa588ae515","Type":"ContainerDied","Data":"c3fd6aac46623fe24f76028df83f8a9173d9cd40bd1364ef0e379350172ad2cc"} Apr 17 21:16:53.029808 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:53.029769 2576 scope.go:117] "RemoveContainer" containerID="14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f" Apr 17 21:16:53.038187 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:53.038164 2576 scope.go:117] "RemoveContainer" containerID="14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f" Apr 17 21:16:53.038495 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:16:53.038475 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f\": container with ID starting with 14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f not found: ID does not exist" containerID="14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f" Apr 17 21:16:53.038542 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:53.038506 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f"} err="failed to get container status \"14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f\": rpc error: code = NotFound desc = could not find container \"14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f\": container with ID starting with 14f5a515c008c7887b4d96c944017e0d15c57c288e0e6cd0add7a2c0501c4b0f not found: ID does not exist" Apr 17 21:16:53.048850 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:53.048824 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-dd99ff88-7xx7r"] Apr 17 21:16:53.052385 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:53.052358 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-dd99ff88-7xx7r"] Apr 17 21:16:53.383365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:53.383330 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d28e489-a207-4cf7-9115-97fa588ae515" path="/var/lib/kubelet/pods/5d28e489-a207-4cf7-9115-97fa588ae515/volumes" Apr 17 21:16:55.571709 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:55.571647 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" podUID="138036f1-c67f-4b4f-b0b9-95346997ca6e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 21:16:55.572079 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:55.571749 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" Apr 17 
21:16:55.572217 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:55.572200 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"8ae1218bb13248b8462759e6c0968ac1a4f8d6d8ac2f3e26b594280dee72a850"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 21:16:55.572254 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:55.572238 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" podUID="138036f1-c67f-4b4f-b0b9-95346997ca6e" containerName="service-proxy" containerID="cri-o://8ae1218bb13248b8462759e6c0968ac1a4f8d6d8ac2f3e26b594280dee72a850" gracePeriod=30 Apr 17 21:16:56.042223 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:56.042186 2576 generic.go:358] "Generic (PLEG): container finished" podID="138036f1-c67f-4b4f-b0b9-95346997ca6e" containerID="8ae1218bb13248b8462759e6c0968ac1a4f8d6d8ac2f3e26b594280dee72a850" exitCode=2 Apr 17 21:16:56.042403 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:56.042236 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" event={"ID":"138036f1-c67f-4b4f-b0b9-95346997ca6e","Type":"ContainerDied","Data":"8ae1218bb13248b8462759e6c0968ac1a4f8d6d8ac2f3e26b594280dee72a850"} Apr 17 21:16:56.042403 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:16:56.042269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57776bc698-chk6t" event={"ID":"138036f1-c67f-4b4f-b0b9-95346997ca6e","Type":"ContainerStarted","Data":"d2288e56f05806aee4664034896b3dfb9ac01a34917a0784955476570b542c3f"} Apr 17 21:17:21.203506 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:21.203464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:17:21.205680 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:21.205640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a39439a-b5a0-4399-975e-838c219449b7-metrics-certs\") pod \"network-metrics-daemon-rcbth\" (UID: \"8a39439a-b5a0-4399-975e-838c219449b7\") " pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:17:21.482879 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:21.482802 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zzp4n\"" Apr 17 21:17:21.491531 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:21.491514 2576 util.go:30] "No sandbox for pod can be found. 
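
The probe failures and the service-proxy restart above follow the kubelet's rule for HTTP probes: a GET against the configured endpoint counts as success only for status codes in the 200-399 range, and repeated liveness failures lead to the container being killed and restarted. A rough Go sketch of that check, assuming the rule as just described; the URL is copied from the readiness failure near the start of this excerpt and is otherwise illustrative:

package main

import (
    "fmt"
    "net/http"
    "time"
)

// probeHTTP mirrors the success rule described above: any status code in [200, 400)
// passes, anything else (or a connection error) fails.
func probeHTTP(url string) error {
    client := &http.Client{Timeout: time.Second}
    resp, err := client.Get(url)
    if err != nil {
        return err // e.g. "connect: connection refused", as in the readiness failure earlier
    }
    defer resp.Body.Close()
    if resp.StatusCode >= 200 && resp.StatusCode < 400 {
        return nil
    }
    return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
    if err := probeHTTP("http://10.132.0.6:8000/readyz"); err != nil {
        fmt.Println("probe failed:", err)
    }
}
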
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rcbth" Apr 17 21:17:21.638825 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:21.638795 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rcbth"] Apr 17 21:17:21.641751 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:17:21.641722 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a39439a_b5a0_4399_975e_838c219449b7.slice/crio-d97f586674cc84c94c362b4490e2e3949ffa9d76f231e50c3ebe6c85166a7565 WatchSource:0}: Error finding container d97f586674cc84c94c362b4490e2e3949ffa9d76f231e50c3ebe6c85166a7565: Status 404 returned error can't find the container with id d97f586674cc84c94c362b4490e2e3949ffa9d76f231e50c3ebe6c85166a7565 Apr 17 21:17:22.108478 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:22.108441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rcbth" event={"ID":"8a39439a-b5a0-4399-975e-838c219449b7","Type":"ContainerStarted","Data":"d97f586674cc84c94c362b4490e2e3949ffa9d76f231e50c3ebe6c85166a7565"} Apr 17 21:17:23.112221 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:23.112183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rcbth" event={"ID":"8a39439a-b5a0-4399-975e-838c219449b7","Type":"ContainerStarted","Data":"f657549f61decd11308086ddc35f45bac5bd45cc3f1fab71753db08464eb79de"} Apr 17 21:17:23.112221 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:23.112224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rcbth" event={"ID":"8a39439a-b5a0-4399-975e-838c219449b7","Type":"ContainerStarted","Data":"f37ded499a9b19ca83ef3092d8972840cfe002b76c5d672430e890008d530970"} Apr 17 21:17:23.128154 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:23.128102 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rcbth" podStartSLOduration=253.073454467 podStartE2EDuration="4m14.128086408s" podCreationTimestamp="2026-04-17 21:13:09 +0000 UTC" firstStartedPulling="2026-04-17 21:17:21.643516892 +0000 UTC m=+252.845009744" lastFinishedPulling="2026-04-17 21:17:22.698148819 +0000 UTC m=+253.899641685" observedRunningTime="2026-04-17 21:17:23.127341339 +0000 UTC m=+254.328834213" watchObservedRunningTime="2026-04-17 21:17:23.128086408 +0000 UTC m=+254.329579281" Apr 17 21:17:51.835132 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:17:51.835058 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" podUID="f81f53a6-485f-4654-b5dd-6b5d5f5784c8" Apr 17 21:17:52.183512 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:52.183479 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:17:55.261276 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:55.261223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:17:55.263693 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:55.263670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f81f53a6-485f-4654-b5dd-6b5d5f5784c8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zsm2v\" (UID: \"f81f53a6-485f-4654-b5dd-6b5d5f5784c8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:17:55.486047 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:55.486012 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vzxvc\"" Apr 17 21:17:55.494497 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:55.494465 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" Apr 17 21:17:55.616327 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:55.616293 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v"] Apr 17 21:17:55.621786 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:17:55.619591 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81f53a6_485f_4654_b5dd_6b5d5f5784c8.slice/crio-0beaa4a2629b064841e336d41f839fdea605fa02b974254e1a31fa7883de4be8 WatchSource:0}: Error finding container 0beaa4a2629b064841e336d41f839fdea605fa02b974254e1a31fa7883de4be8: Status 404 returned error can't find the container with id 0beaa4a2629b064841e336d41f839fdea605fa02b974254e1a31fa7883de4be8 Apr 17 21:17:56.194580 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:56.194540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" event={"ID":"f81f53a6-485f-4654-b5dd-6b5d5f5784c8","Type":"ContainerStarted","Data":"0beaa4a2629b064841e336d41f839fdea605fa02b974254e1a31fa7883de4be8"} Apr 17 21:17:57.198220 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:57.198187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" event={"ID":"f81f53a6-485f-4654-b5dd-6b5d5f5784c8","Type":"ContainerStarted","Data":"6b0d2dbf6d7a301cde3f605eb0946ca957ba5bd1d5fff1968da54e11f27b4787"} Apr 17 21:17:57.213372 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:17:57.213317 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zsm2v" podStartSLOduration=272.143634059 podStartE2EDuration="4m33.213301099s" podCreationTimestamp="2026-04-17 21:13:24 +0000 UTC" firstStartedPulling="2026-04-17 21:17:55.62324878 +0000 UTC m=+286.824741633" lastFinishedPulling="2026-04-17 21:17:56.692915821 +0000 UTC m=+287.894408673" observedRunningTime="2026-04-17 21:17:57.21151216 +0000 
UTC m=+288.413005034" watchObservedRunningTime="2026-04-17 21:17:57.213301099 +0000 UTC m=+288.414793973" Apr 17 21:18:09.267184 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:18:09.267155 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 21:20:39.443684 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.443584 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb"] Apr 17 21:20:39.444204 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.443884 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d28e489-a207-4cf7-9115-97fa588ae515" containerName="registry" Apr 17 21:20:39.444204 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.443899 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d28e489-a207-4cf7-9115-97fa588ae515" containerName="registry" Apr 17 21:20:39.444204 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.443949 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d28e489-a207-4cf7-9115-97fa588ae515" containerName="registry" Apr 17 21:20:39.446603 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.446586 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" Apr 17 21:20:39.449080 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.449055 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 21:20:39.449080 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.449073 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:20:39.449917 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.449896 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-44fmg\"" Apr 17 21:20:39.453592 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.453570 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb"] Apr 17 21:20:39.587762 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.587726 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b6afc13-c81b-4dd2-90c2-699a2fad97dd-tmp\") pod \"openshift-lws-operator-bfc7f696d-k8bwb\" (UID: \"6b6afc13-c81b-4dd2-90c2-699a2fad97dd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" Apr 17 21:20:39.587925 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.587768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpq84\" (UniqueName: \"kubernetes.io/projected/6b6afc13-c81b-4dd2-90c2-699a2fad97dd-kube-api-access-cpq84\") pod \"openshift-lws-operator-bfc7f696d-k8bwb\" (UID: \"6b6afc13-c81b-4dd2-90c2-699a2fad97dd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" Apr 17 21:20:39.688764 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.688708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b6afc13-c81b-4dd2-90c2-699a2fad97dd-tmp\") pod \"openshift-lws-operator-bfc7f696d-k8bwb\" (UID: \"6b6afc13-c81b-4dd2-90c2-699a2fad97dd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" Apr 17 21:20:39.688961 ip-10-0-138-36 
kubenswrapper[2576]: I0417 21:20:39.688778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpq84\" (UniqueName: \"kubernetes.io/projected/6b6afc13-c81b-4dd2-90c2-699a2fad97dd-kube-api-access-cpq84\") pod \"openshift-lws-operator-bfc7f696d-k8bwb\" (UID: \"6b6afc13-c81b-4dd2-90c2-699a2fad97dd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" Apr 17 21:20:39.689111 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.689087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b6afc13-c81b-4dd2-90c2-699a2fad97dd-tmp\") pod \"openshift-lws-operator-bfc7f696d-k8bwb\" (UID: \"6b6afc13-c81b-4dd2-90c2-699a2fad97dd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" Apr 17 21:20:39.705966 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.705907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpq84\" (UniqueName: \"kubernetes.io/projected/6b6afc13-c81b-4dd2-90c2-699a2fad97dd-kube-api-access-cpq84\") pod \"openshift-lws-operator-bfc7f696d-k8bwb\" (UID: \"6b6afc13-c81b-4dd2-90c2-699a2fad97dd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" Apr 17 21:20:39.755415 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.755388 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" Apr 17 21:20:39.867325 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.867296 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb"] Apr 17 21:20:39.870069 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:20:39.870042 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b6afc13_c81b_4dd2_90c2_699a2fad97dd.slice/crio-d74ccf3adfe32f9c17aa202cfb8e1cb0dce36f155fea9dc879f89b4fd6e3a0cf WatchSource:0}: Error finding container d74ccf3adfe32f9c17aa202cfb8e1cb0dce36f155fea9dc879f89b4fd6e3a0cf: Status 404 returned error can't find the container with id d74ccf3adfe32f9c17aa202cfb8e1cb0dce36f155fea9dc879f89b4fd6e3a0cf Apr 17 21:20:39.871367 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:39.871349 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:20:40.607939 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:40.607899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" event={"ID":"6b6afc13-c81b-4dd2-90c2-699a2fad97dd","Type":"ContainerStarted","Data":"d74ccf3adfe32f9c17aa202cfb8e1cb0dce36f155fea9dc879f89b4fd6e3a0cf"} Apr 17 21:20:43.616808 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:43.616770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" event={"ID":"6b6afc13-c81b-4dd2-90c2-699a2fad97dd","Type":"ContainerStarted","Data":"426d8f01e630bc0004b089b38bbb5f3422eb8e89e6d87d078e87c362e85c9b88"} Apr 17 21:20:43.632698 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:20:43.632636 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k8bwb" podStartSLOduration=1.487550822 podStartE2EDuration="4.632623031s" podCreationTimestamp="2026-04-17 21:20:39 +0000 UTC" firstStartedPulling="2026-04-17 21:20:39.871466161 +0000 
UTC m=+451.072959012" lastFinishedPulling="2026-04-17 21:20:43.016538368 +0000 UTC m=+454.218031221" observedRunningTime="2026-04-17 21:20:43.631264724 +0000 UTC m=+454.832757597" watchObservedRunningTime="2026-04-17 21:20:43.632623031 +0000 UTC m=+454.834115904" Apr 17 21:21:00.490212 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.490175 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn"] Apr 17 21:21:00.493027 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.493002 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.495492 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.495442 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 21:21:00.495492 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.495488 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-2j2w8\"" Apr 17 21:21:00.495680 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.495593 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 21:21:00.495732 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.495694 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 21:21:00.495769 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.495737 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 21:21:00.505462 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.505442 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn"] Apr 17 21:21:00.539267 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.539234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mdq6\" (UniqueName: \"kubernetes.io/projected/335e1c2e-d681-45f4-a58f-ead32b0515c7-kube-api-access-7mdq6\") pod \"opendatahub-operator-controller-manager-694fdf7c65-tzlqn\" (UID: \"335e1c2e-d681-45f4-a58f-ead32b0515c7\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.539267 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.539264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/335e1c2e-d681-45f4-a58f-ead32b0515c7-webhook-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-tzlqn\" (UID: \"335e1c2e-d681-45f4-a58f-ead32b0515c7\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.539440 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.539291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/335e1c2e-d681-45f4-a58f-ead32b0515c7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-tzlqn\" (UID: \"335e1c2e-d681-45f4-a58f-ead32b0515c7\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.639899 ip-10-0-138-36 kubenswrapper[2576]: I0417 
21:21:00.639865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/335e1c2e-d681-45f4-a58f-ead32b0515c7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-tzlqn\" (UID: \"335e1c2e-d681-45f4-a58f-ead32b0515c7\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.640066 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.639925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mdq6\" (UniqueName: \"kubernetes.io/projected/335e1c2e-d681-45f4-a58f-ead32b0515c7-kube-api-access-7mdq6\") pod \"opendatahub-operator-controller-manager-694fdf7c65-tzlqn\" (UID: \"335e1c2e-d681-45f4-a58f-ead32b0515c7\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.640066 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.639947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/335e1c2e-d681-45f4-a58f-ead32b0515c7-webhook-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-tzlqn\" (UID: \"335e1c2e-d681-45f4-a58f-ead32b0515c7\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.642302 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.642272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/335e1c2e-d681-45f4-a58f-ead32b0515c7-webhook-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-tzlqn\" (UID: \"335e1c2e-d681-45f4-a58f-ead32b0515c7\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.642408 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.642382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/335e1c2e-d681-45f4-a58f-ead32b0515c7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-tzlqn\" (UID: \"335e1c2e-d681-45f4-a58f-ead32b0515c7\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.647516 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.647495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mdq6\" (UniqueName: \"kubernetes.io/projected/335e1c2e-d681-45f4-a58f-ead32b0515c7-kube-api-access-7mdq6\") pod \"opendatahub-operator-controller-manager-694fdf7c65-tzlqn\" (UID: \"335e1c2e-d681-45f4-a58f-ead32b0515c7\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.803423 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.803342 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:00.917731 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:00.917699 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn"] Apr 17 21:21:00.921287 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:21:00.921258 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod335e1c2e_d681_45f4_a58f_ead32b0515c7.slice/crio-34a76901062500d7890b7fe9339f0395d609376f326b0eea274896a8a680b8d4 WatchSource:0}: Error finding container 34a76901062500d7890b7fe9339f0395d609376f326b0eea274896a8a680b8d4: Status 404 returned error can't find the container with id 34a76901062500d7890b7fe9339f0395d609376f326b0eea274896a8a680b8d4 Apr 17 21:21:01.670446 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:01.670368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" event={"ID":"335e1c2e-d681-45f4-a58f-ead32b0515c7","Type":"ContainerStarted","Data":"34a76901062500d7890b7fe9339f0395d609376f326b0eea274896a8a680b8d4"} Apr 17 21:21:03.676201 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:03.676116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" event={"ID":"335e1c2e-d681-45f4-a58f-ead32b0515c7","Type":"ContainerStarted","Data":"b14d2fe9b0a4091e3618d568dec20789f5785d97859ad7728488023477cd3588"} Apr 17 21:21:03.676576 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:03.676255 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:03.698214 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:03.698160 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" podStartSLOduration=1.271072585 podStartE2EDuration="3.698144578s" podCreationTimestamp="2026-04-17 21:21:00 +0000 UTC" firstStartedPulling="2026-04-17 21:21:00.922889622 +0000 UTC m=+472.124382474" lastFinishedPulling="2026-04-17 21:21:03.349961611 +0000 UTC m=+474.551454467" observedRunningTime="2026-04-17 21:21:03.696406592 +0000 UTC m=+474.897899467" watchObservedRunningTime="2026-04-17 21:21:03.698144578 +0000 UTC m=+474.899637455" Apr 17 21:21:14.681677 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:14.681624 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-tzlqn" Apr 17 21:21:22.082243 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.082208 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-qjchn"] Apr 17 21:21:22.094965 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.094929 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-qjchn"] Apr 17 21:21:22.095123 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.095048 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:22.097189 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.097163 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-575c2\"" Apr 17 21:21:22.097189 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.097178 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 17 21:21:22.198156 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.198115 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-cert\") pod \"odh-model-controller-858dbf95b8-qjchn\" (UID: \"c12a37a5-5ddd-4c3f-b3db-3f1d14abc307\") " pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:22.198334 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.198175 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vxtz\" (UniqueName: \"kubernetes.io/projected/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-kube-api-access-5vxtz\") pod \"odh-model-controller-858dbf95b8-qjchn\" (UID: \"c12a37a5-5ddd-4c3f-b3db-3f1d14abc307\") " pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:22.298524 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.298489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-cert\") pod \"odh-model-controller-858dbf95b8-qjchn\" (UID: \"c12a37a5-5ddd-4c3f-b3db-3f1d14abc307\") " pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:22.298641 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.298544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vxtz\" (UniqueName: \"kubernetes.io/projected/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-kube-api-access-5vxtz\") pod \"odh-model-controller-858dbf95b8-qjchn\" (UID: \"c12a37a5-5ddd-4c3f-b3db-3f1d14abc307\") " pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:22.298713 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:21:22.298664 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 21:21:22.298751 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:21:22.298738 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-cert podName:c12a37a5-5ddd-4c3f-b3db-3f1d14abc307 nodeName:}" failed. No retries permitted until 2026-04-17 21:21:22.798721168 +0000 UTC m=+494.000214020 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-cert") pod "odh-model-controller-858dbf95b8-qjchn" (UID: "c12a37a5-5ddd-4c3f-b3db-3f1d14abc307") : secret "odh-model-controller-webhook-cert" not found Apr 17 21:21:22.306636 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.306610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vxtz\" (UniqueName: \"kubernetes.io/projected/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-kube-api-access-5vxtz\") pod \"odh-model-controller-858dbf95b8-qjchn\" (UID: \"c12a37a5-5ddd-4c3f-b3db-3f1d14abc307\") " pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:22.803363 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:22.803325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-cert\") pod \"odh-model-controller-858dbf95b8-qjchn\" (UID: \"c12a37a5-5ddd-4c3f-b3db-3f1d14abc307\") " pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:22.803538 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:21:22.803431 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 21:21:22.803538 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:21:22.803487 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-cert podName:c12a37a5-5ddd-4c3f-b3db-3f1d14abc307 nodeName:}" failed. No retries permitted until 2026-04-17 21:21:23.803473312 +0000 UTC m=+495.004966163 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-cert") pod "odh-model-controller-858dbf95b8-qjchn" (UID: "c12a37a5-5ddd-4c3f-b3db-3f1d14abc307") : secret "odh-model-controller-webhook-cert" not found Apr 17 21:21:23.810724 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:23.810692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-cert\") pod \"odh-model-controller-858dbf95b8-qjchn\" (UID: \"c12a37a5-5ddd-4c3f-b3db-3f1d14abc307\") " pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:23.813070 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:23.813039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c12a37a5-5ddd-4c3f-b3db-3f1d14abc307-cert\") pod \"odh-model-controller-858dbf95b8-qjchn\" (UID: \"c12a37a5-5ddd-4c3f-b3db-3f1d14abc307\") " pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:23.904844 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:23.904807 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:24.016999 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.016968 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-qjchn"] Apr 17 21:21:24.019837 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:21:24.019809 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc12a37a5_5ddd_4c3f_b3db_3f1d14abc307.slice/crio-22d009e461953d7818fe320508a7fcbaea8bbc069a987d86722f5ce06dcc5129 WatchSource:0}: Error finding container 22d009e461953d7818fe320508a7fcbaea8bbc069a987d86722f5ce06dcc5129: Status 404 returned error can't find the container with id 22d009e461953d7818fe320508a7fcbaea8bbc069a987d86722f5ce06dcc5129 Apr 17 21:21:24.626422 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.626390 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h"] Apr 17 21:21:24.629636 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.629606 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:24.631988 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.631957 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 21:21:24.632127 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.632103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 21:21:24.632208 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.631960 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 21:21:24.632452 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.632295 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 21:21:24.638614 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.638394 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h"] Apr 17 21:21:24.638762 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.633514 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-phnf9\"" Apr 17 21:21:24.718202 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.718164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tsfm\" (UniqueName: \"kubernetes.io/projected/1fb263c3-60ed-4d7a-83a0-c7828118a401-kube-api-access-2tsfm\") pod \"kube-auth-proxy-56dddbd4f7-pfk7h\" (UID: \"1fb263c3-60ed-4d7a-83a0-c7828118a401\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:24.718202 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.718204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fb263c3-60ed-4d7a-83a0-c7828118a401-tls-certs\") pod \"kube-auth-proxy-56dddbd4f7-pfk7h\" (UID: \"1fb263c3-60ed-4d7a-83a0-c7828118a401\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:24.718431 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.718291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fb263c3-60ed-4d7a-83a0-c7828118a401-tmp\") pod \"kube-auth-proxy-56dddbd4f7-pfk7h\" (UID: \"1fb263c3-60ed-4d7a-83a0-c7828118a401\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:24.732340 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.732303 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" event={"ID":"c12a37a5-5ddd-4c3f-b3db-3f1d14abc307","Type":"ContainerStarted","Data":"22d009e461953d7818fe320508a7fcbaea8bbc069a987d86722f5ce06dcc5129"} Apr 17 21:21:24.818849 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.818812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fb263c3-60ed-4d7a-83a0-c7828118a401-tls-certs\") pod \"kube-auth-proxy-56dddbd4f7-pfk7h\" (UID: \"1fb263c3-60ed-4d7a-83a0-c7828118a401\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:24.819242 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.818862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fb263c3-60ed-4d7a-83a0-c7828118a401-tmp\") pod \"kube-auth-proxy-56dddbd4f7-pfk7h\" (UID: \"1fb263c3-60ed-4d7a-83a0-c7828118a401\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:24.819242 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.818914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tsfm\" (UniqueName: \"kubernetes.io/projected/1fb263c3-60ed-4d7a-83a0-c7828118a401-kube-api-access-2tsfm\") pod \"kube-auth-proxy-56dddbd4f7-pfk7h\" (UID: \"1fb263c3-60ed-4d7a-83a0-c7828118a401\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:24.819242 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:21:24.818964 2576 secret.go:189] Couldn't get secret openshift-ingress/kube-auth-proxy-tls: secret "kube-auth-proxy-tls" not found Apr 17 21:21:24.819242 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:21:24.819046 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fb263c3-60ed-4d7a-83a0-c7828118a401-tls-certs podName:1fb263c3-60ed-4d7a-83a0-c7828118a401 nodeName:}" failed. No retries permitted until 2026-04-17 21:21:25.319029785 +0000 UTC m=+496.520522636 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1fb263c3-60ed-4d7a-83a0-c7828118a401-tls-certs") pod "kube-auth-proxy-56dddbd4f7-pfk7h" (UID: "1fb263c3-60ed-4d7a-83a0-c7828118a401") : secret "kube-auth-proxy-tls" not found Apr 17 21:21:24.821156 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.821134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fb263c3-60ed-4d7a-83a0-c7828118a401-tmp\") pod \"kube-auth-proxy-56dddbd4f7-pfk7h\" (UID: \"1fb263c3-60ed-4d7a-83a0-c7828118a401\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:24.828033 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:24.827973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tsfm\" (UniqueName: \"kubernetes.io/projected/1fb263c3-60ed-4d7a-83a0-c7828118a401-kube-api-access-2tsfm\") pod \"kube-auth-proxy-56dddbd4f7-pfk7h\" (UID: \"1fb263c3-60ed-4d7a-83a0-c7828118a401\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:25.323894 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:25.323854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fb263c3-60ed-4d7a-83a0-c7828118a401-tls-certs\") pod \"kube-auth-proxy-56dddbd4f7-pfk7h\" (UID: \"1fb263c3-60ed-4d7a-83a0-c7828118a401\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:25.327246 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:25.327215 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fb263c3-60ed-4d7a-83a0-c7828118a401-tls-certs\") pod \"kube-auth-proxy-56dddbd4f7-pfk7h\" (UID: \"1fb263c3-60ed-4d7a-83a0-c7828118a401\") " pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:25.547435 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:25.547393 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" Apr 17 21:21:25.717161 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:25.717130 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h"] Apr 17 21:21:26.733353 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:21:26.733310 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fb263c3_60ed_4d7a_83a0_c7828118a401.slice/crio-3e8f8429b56b8be61352ae0f4eb8ad88ffdd2c48a5251546fca044275ff144a7 WatchSource:0}: Error finding container 3e8f8429b56b8be61352ae0f4eb8ad88ffdd2c48a5251546fca044275ff144a7: Status 404 returned error can't find the container with id 3e8f8429b56b8be61352ae0f4eb8ad88ffdd2c48a5251546fca044275ff144a7 Apr 17 21:21:26.739759 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:26.739728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" event={"ID":"1fb263c3-60ed-4d7a-83a0-c7828118a401","Type":"ContainerStarted","Data":"3e8f8429b56b8be61352ae0f4eb8ad88ffdd2c48a5251546fca044275ff144a7"} Apr 17 21:21:27.203050 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.203005 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-d7dzj"] Apr 17 21:21:27.206666 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.206592 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:21:27.209063 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.209032 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 17 21:21:27.209233 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.209215 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-gds8t\"" Apr 17 21:21:27.212434 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.212406 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-d7dzj"] Apr 17 21:21:27.240008 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.239965 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8c31f98-fb3d-4210-a0d8-cad1450fb582-cert\") pod \"kserve-controller-manager-856948b99f-d7dzj\" (UID: \"a8c31f98-fb3d-4210-a0d8-cad1450fb582\") " pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:21:27.240201 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.240055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b5s7\" (UniqueName: \"kubernetes.io/projected/a8c31f98-fb3d-4210-a0d8-cad1450fb582-kube-api-access-9b5s7\") pod \"kserve-controller-manager-856948b99f-d7dzj\" (UID: \"a8c31f98-fb3d-4210-a0d8-cad1450fb582\") " pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:21:27.341500 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.341467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8c31f98-fb3d-4210-a0d8-cad1450fb582-cert\") pod \"kserve-controller-manager-856948b99f-d7dzj\" (UID: \"a8c31f98-fb3d-4210-a0d8-cad1450fb582\") " pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:21:27.341719 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.341526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9b5s7\" (UniqueName: \"kubernetes.io/projected/a8c31f98-fb3d-4210-a0d8-cad1450fb582-kube-api-access-9b5s7\") pod \"kserve-controller-manager-856948b99f-d7dzj\" (UID: \"a8c31f98-fb3d-4210-a0d8-cad1450fb582\") " pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:21:27.341719 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:21:27.341632 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 21:21:27.341859 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:21:27.341736 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8c31f98-fb3d-4210-a0d8-cad1450fb582-cert podName:a8c31f98-fb3d-4210-a0d8-cad1450fb582 nodeName:}" failed. No retries permitted until 2026-04-17 21:21:27.841713744 +0000 UTC m=+499.043206601 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8c31f98-fb3d-4210-a0d8-cad1450fb582-cert") pod "kserve-controller-manager-856948b99f-d7dzj" (UID: "a8c31f98-fb3d-4210-a0d8-cad1450fb582") : secret "kserve-webhook-server-cert" not found Apr 17 21:21:27.367905 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.367875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b5s7\" (UniqueName: \"kubernetes.io/projected/a8c31f98-fb3d-4210-a0d8-cad1450fb582-kube-api-access-9b5s7\") pod \"kserve-controller-manager-856948b99f-d7dzj\" (UID: \"a8c31f98-fb3d-4210-a0d8-cad1450fb582\") " pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:21:27.745089 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.745049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" event={"ID":"c12a37a5-5ddd-4c3f-b3db-3f1d14abc307","Type":"ContainerStarted","Data":"53c89fd614c99aaf98ba09ddd447444a630691d5fbbf6085a511a556048bdedc"} Apr 17 21:21:27.745580 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.745268 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:27.764297 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.764235 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" podStartSLOduration=3.003861943 podStartE2EDuration="5.764215191s" podCreationTimestamp="2026-04-17 21:21:22 +0000 UTC" firstStartedPulling="2026-04-17 21:21:24.021141667 +0000 UTC m=+495.222634518" lastFinishedPulling="2026-04-17 21:21:26.781494913 +0000 UTC m=+497.982987766" observedRunningTime="2026-04-17 21:21:27.762787305 +0000 UTC m=+498.964280180" watchObservedRunningTime="2026-04-17 21:21:27.764215191 +0000 UTC m=+498.965708069" Apr 17 21:21:27.845957 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.845771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8c31f98-fb3d-4210-a0d8-cad1450fb582-cert\") pod \"kserve-controller-manager-856948b99f-d7dzj\" (UID: \"a8c31f98-fb3d-4210-a0d8-cad1450fb582\") " pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:21:27.850188 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:27.850139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8c31f98-fb3d-4210-a0d8-cad1450fb582-cert\") pod \"kserve-controller-manager-856948b99f-d7dzj\" (UID: \"a8c31f98-fb3d-4210-a0d8-cad1450fb582\") " pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:21:28.124332 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:28.124288 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:21:29.452502 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:29.452469 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-d7dzj"] Apr 17 21:21:29.455427 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:21:29.455397 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c31f98_fb3d_4210_a0d8_cad1450fb582.slice/crio-b12636cd9e2125097dae808d7cf67cccd9ed8143b3c255001b65055927531279 WatchSource:0}: Error finding container b12636cd9e2125097dae808d7cf67cccd9ed8143b3c255001b65055927531279: Status 404 returned error can't find the container with id b12636cd9e2125097dae808d7cf67cccd9ed8143b3c255001b65055927531279 Apr 17 21:21:29.752832 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:29.752795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" event={"ID":"1fb263c3-60ed-4d7a-83a0-c7828118a401","Type":"ContainerStarted","Data":"d6bb2da7b038f1c8b52b510d0ce86bb1516bc063e09a2347bc7c1298dbdf14e0"} Apr 17 21:21:29.753842 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:29.753816 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" event={"ID":"a8c31f98-fb3d-4210-a0d8-cad1450fb582","Type":"ContainerStarted","Data":"b12636cd9e2125097dae808d7cf67cccd9ed8143b3c255001b65055927531279"} Apr 17 21:21:29.768492 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:29.768444 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-56dddbd4f7-pfk7h" podStartSLOduration=3.117832803 podStartE2EDuration="5.768430706s" podCreationTimestamp="2026-04-17 21:21:24 +0000 UTC" firstStartedPulling="2026-04-17 21:21:26.73511224 +0000 UTC m=+497.936605092" lastFinishedPulling="2026-04-17 21:21:29.385710141 +0000 UTC m=+500.587202995" observedRunningTime="2026-04-17 21:21:29.76764336 +0000 UTC m=+500.969136233" watchObservedRunningTime="2026-04-17 21:21:29.768430706 +0000 UTC m=+500.969923579" Apr 17 21:21:32.764840 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:32.764797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" event={"ID":"a8c31f98-fb3d-4210-a0d8-cad1450fb582","Type":"ContainerStarted","Data":"9864f787f0569b3b1a4e87909d61afd4ee6e26c7768010270983689f09475687"} Apr 17 21:21:32.765196 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:32.764883 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:21:32.779575 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:32.779528 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" podStartSLOduration=2.920959528 podStartE2EDuration="5.779514928s" podCreationTimestamp="2026-04-17 21:21:27 +0000 UTC" firstStartedPulling="2026-04-17 21:21:29.456817602 +0000 UTC m=+500.658310455" lastFinishedPulling="2026-04-17 21:21:32.315373001 +0000 UTC m=+503.516865855" observedRunningTime="2026-04-17 21:21:32.778413715 +0000 UTC m=+503.979906601" watchObservedRunningTime="2026-04-17 21:21:32.779514928 +0000 UTC m=+503.981007803" Apr 17 21:21:38.751605 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:38.751573 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-qjchn" Apr 17 21:21:39.489881 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.489844 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cbssq"] Apr 17 21:21:39.493194 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.493171 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" Apr 17 21:21:39.497612 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.497582 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 21:21:39.497903 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.497885 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 21:21:39.498542 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.498525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-ch48r\"" Apr 17 21:21:39.509396 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.509364 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cbssq"] Apr 17 21:21:39.538073 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.538035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6a94c838-8633-4425-9f9e-1bc1a2cfbd4f-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cbssq\" (UID: \"6a94c838-8633-4425-9f9e-1bc1a2cfbd4f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" Apr 17 21:21:39.538246 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.538104 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfg85\" (UniqueName: \"kubernetes.io/projected/6a94c838-8633-4425-9f9e-1bc1a2cfbd4f-kube-api-access-pfg85\") pod \"servicemesh-operator3-55f49c5f94-cbssq\" (UID: \"6a94c838-8633-4425-9f9e-1bc1a2cfbd4f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" Apr 17 21:21:39.639320 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.639284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6a94c838-8633-4425-9f9e-1bc1a2cfbd4f-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cbssq\" (UID: \"6a94c838-8633-4425-9f9e-1bc1a2cfbd4f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" Apr 17 21:21:39.639508 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.639339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfg85\" (UniqueName: \"kubernetes.io/projected/6a94c838-8633-4425-9f9e-1bc1a2cfbd4f-kube-api-access-pfg85\") pod \"servicemesh-operator3-55f49c5f94-cbssq\" (UID: \"6a94c838-8633-4425-9f9e-1bc1a2cfbd4f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" Apr 17 21:21:39.641867 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.641847 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6a94c838-8633-4425-9f9e-1bc1a2cfbd4f-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cbssq\" (UID: \"6a94c838-8633-4425-9f9e-1bc1a2cfbd4f\") " 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" Apr 17 21:21:39.647585 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.647560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfg85\" (UniqueName: \"kubernetes.io/projected/6a94c838-8633-4425-9f9e-1bc1a2cfbd4f-kube-api-access-pfg85\") pod \"servicemesh-operator3-55f49c5f94-cbssq\" (UID: \"6a94c838-8633-4425-9f9e-1bc1a2cfbd4f\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" Apr 17 21:21:39.803206 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.803102 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" Apr 17 21:21:39.931158 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:39.931119 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cbssq"] Apr 17 21:21:39.934818 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:21:39.934785 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a94c838_8633_4425_9f9e_1bc1a2cfbd4f.slice/crio-b786744554cf2a5696a7712c97b4ffb3df1a0c4e0b82b7b42b3a9b6ce9621f66 WatchSource:0}: Error finding container b786744554cf2a5696a7712c97b4ffb3df1a0c4e0b82b7b42b3a9b6ce9621f66: Status 404 returned error can't find the container with id b786744554cf2a5696a7712c97b4ffb3df1a0c4e0b82b7b42b3a9b6ce9621f66 Apr 17 21:21:40.790741 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:40.790697 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" event={"ID":"6a94c838-8633-4425-9f9e-1bc1a2cfbd4f","Type":"ContainerStarted","Data":"b786744554cf2a5696a7712c97b4ffb3df1a0c4e0b82b7b42b3a9b6ce9621f66"} Apr 17 21:21:44.807057 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:44.807013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" event={"ID":"6a94c838-8633-4425-9f9e-1bc1a2cfbd4f","Type":"ContainerStarted","Data":"4640d7499527048e2243f9b394ce9e1cca523afabe5716815e977a1699356d58"} Apr 17 21:21:44.807510 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:44.807155 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" Apr 17 21:21:44.832506 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:44.832440 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" podStartSLOduration=1.155500573 podStartE2EDuration="5.832424549s" podCreationTimestamp="2026-04-17 21:21:39 +0000 UTC" firstStartedPulling="2026-04-17 21:21:39.938130589 +0000 UTC m=+511.139623442" lastFinishedPulling="2026-04-17 21:21:44.61505455 +0000 UTC m=+515.816547418" observedRunningTime="2026-04-17 21:21:44.830315185 +0000 UTC m=+516.031808061" watchObservedRunningTime="2026-04-17 21:21:44.832424549 +0000 UTC m=+516.033917424" Apr 17 21:21:55.812614 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:55.812578 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cbssq" Apr 17 21:21:56.612958 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.612923 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp"] Apr 17 21:21:56.616290 ip-10-0-138-36 
kubenswrapper[2576]: I0417 21:21:56.616270 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.619051 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.619021 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 21:21:56.619051 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.619039 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 21:21:56.619271 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.619038 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 21:21:56.619271 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.619070 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 21:21:56.620048 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.620033 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-fm7zb\"" Apr 17 21:21:56.628300 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.628269 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp"] Apr 17 21:21:56.767272 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.767238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.767465 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.767312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.767465 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.767382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b68c0e20-2a48-4332-b51b-3c2102fe7149-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.767465 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.767432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx8x7\" (UniqueName: \"kubernetes.io/projected/b68c0e20-2a48-4332-b51b-3c2102fe7149-kube-api-access-gx8x7\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.767632 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.767490 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b68c0e20-2a48-4332-b51b-3c2102fe7149-local-certs\") pod 
\"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.767632 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.767528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.767632 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.767574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.868289 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.868188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.868289 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.868231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.868289 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.868268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.868914 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.868298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b68c0e20-2a48-4332-b51b-3c2102fe7149-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.868914 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.868320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx8x7\" (UniqueName: \"kubernetes.io/projected/b68c0e20-2a48-4332-b51b-3c2102fe7149-kube-api-access-gx8x7\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.868914 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.868358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/b68c0e20-2a48-4332-b51b-3c2102fe7149-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.868914 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.868391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.869086 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.868928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.871448 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.871411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.871571 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.871411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b68c0e20-2a48-4332-b51b-3c2102fe7149-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.871571 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.871506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.871571 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.871518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b68c0e20-2a48-4332-b51b-3c2102fe7149-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.885977 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.885948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b68c0e20-2a48-4332-b51b-3c2102fe7149-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.886535 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.886512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx8x7\" (UniqueName: 
\"kubernetes.io/projected/b68c0e20-2a48-4332-b51b-3c2102fe7149-kube-api-access-gx8x7\") pod \"istiod-openshift-gateway-55ff986f96-95drp\" (UID: \"b68c0e20-2a48-4332-b51b-3c2102fe7149\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:56.926033 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:56.925984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:21:57.072889 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:57.072859 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp"] Apr 17 21:21:57.074369 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:21:57.074333 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb68c0e20_2a48_4332_b51b_3c2102fe7149.slice/crio-1dcd08934da9586ea1f727a9824f930231eaf1a21a15cd3218dbca0a89699ad8 WatchSource:0}: Error finding container 1dcd08934da9586ea1f727a9824f930231eaf1a21a15cd3218dbca0a89699ad8: Status 404 returned error can't find the container with id 1dcd08934da9586ea1f727a9824f930231eaf1a21a15cd3218dbca0a89699ad8 Apr 17 21:21:57.853760 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:21:57.853712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" event={"ID":"b68c0e20-2a48-4332-b51b-3c2102fe7149","Type":"ContainerStarted","Data":"1dcd08934da9586ea1f727a9824f930231eaf1a21a15cd3218dbca0a89699ad8"} Apr 17 21:22:00.178164 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:00.178116 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:22:00.178489 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:00.178208 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:22:00.865629 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:00.865567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" event={"ID":"b68c0e20-2a48-4332-b51b-3c2102fe7149","Type":"ContainerStarted","Data":"f5d0ce52856462edd5947b83d66cb5e1814990370d34e330579a2a2b39d505fb"} Apr 17 21:22:00.865871 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:00.865790 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:22:00.884935 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:00.884879 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" podStartSLOduration=1.783149305 podStartE2EDuration="4.884862009s" podCreationTimestamp="2026-04-17 21:21:56 +0000 UTC" firstStartedPulling="2026-04-17 21:21:57.076130811 +0000 UTC m=+528.277623663" lastFinishedPulling="2026-04-17 21:22:00.177843515 +0000 UTC m=+531.379336367" observedRunningTime="2026-04-17 21:22:00.882921484 +0000 UTC m=+532.084414359" watchObservedRunningTime="2026-04-17 21:22:00.884862009 +0000 UTC m=+532.086354885" Apr 17 21:22:01.870563 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:01.870533 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-95drp" Apr 17 21:22:03.774086 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:03.774046 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-d7dzj" Apr 17 21:22:50.056189 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.056150 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs"] Apr 17 21:22:50.058637 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.058613 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs" Apr 17 21:22:50.061108 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.061082 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 21:22:50.061222 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.061128 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 21:22:50.061222 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.061088 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 21:22:50.061917 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.061903 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-6sddx\"" Apr 17 21:22:50.069133 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.069106 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs"] Apr 17 21:22:50.069498 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.069478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hmfn\" (UniqueName: \"kubernetes.io/projected/d240f8bc-a3db-485f-abdb-9a9305b4aa8f-kube-api-access-6hmfn\") pod \"dns-operator-controller-manager-648d5c98bc-hgjcs\" (UID: \"d240f8bc-a3db-485f-abdb-9a9305b4aa8f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs" Apr 17 21:22:50.169989 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.169948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hmfn\" (UniqueName: \"kubernetes.io/projected/d240f8bc-a3db-485f-abdb-9a9305b4aa8f-kube-api-access-6hmfn\") pod \"dns-operator-controller-manager-648d5c98bc-hgjcs\" (UID: \"d240f8bc-a3db-485f-abdb-9a9305b4aa8f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs" Apr 17 21:22:50.185629 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.185601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hmfn\" (UniqueName: \"kubernetes.io/projected/d240f8bc-a3db-485f-abdb-9a9305b4aa8f-kube-api-access-6hmfn\") pod \"dns-operator-controller-manager-648d5c98bc-hgjcs\" (UID: \"d240f8bc-a3db-485f-abdb-9a9305b4aa8f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs" Apr 17 21:22:50.368445 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.368409 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs" Apr 17 21:22:50.502907 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:50.502881 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs"] Apr 17 21:22:50.505118 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:22:50.505084 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd240f8bc_a3db_485f_abdb_9a9305b4aa8f.slice/crio-05d901338b4bee524d92f94b2015a80f22681fda98ef2d0d64d146b2f0d3fba2 WatchSource:0}: Error finding container 05d901338b4bee524d92f94b2015a80f22681fda98ef2d0d64d146b2f0d3fba2: Status 404 returned error can't find the container with id 05d901338b4bee524d92f94b2015a80f22681fda98ef2d0d64d146b2f0d3fba2 Apr 17 21:22:51.020341 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:51.020296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs" event={"ID":"d240f8bc-a3db-485f-abdb-9a9305b4aa8f","Type":"ContainerStarted","Data":"05d901338b4bee524d92f94b2015a80f22681fda98ef2d0d64d146b2f0d3fba2"} Apr 17 21:22:54.030981 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:54.030941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs" event={"ID":"d240f8bc-a3db-485f-abdb-9a9305b4aa8f","Type":"ContainerStarted","Data":"7b784d1d5163db85086d0f4dea74962a8654c5394608630b3306dc571750dc6f"} Apr 17 21:22:54.031349 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:54.031127 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs" Apr 17 21:22:54.057786 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:54.057737 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs" podStartSLOduration=1.5638566200000001 podStartE2EDuration="4.057720701s" podCreationTimestamp="2026-04-17 21:22:50 +0000 UTC" firstStartedPulling="2026-04-17 21:22:50.507401336 +0000 UTC m=+581.708894189" lastFinishedPulling="2026-04-17 21:22:53.0012654 +0000 UTC m=+584.202758270" observedRunningTime="2026-04-17 21:22:54.05644544 +0000 UTC m=+585.257938314" watchObservedRunningTime="2026-04-17 21:22:54.057720701 +0000 UTC m=+585.259213573" Apr 17 21:22:58.838621 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:58.838583 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-l9467"] Apr 17 21:22:58.840977 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:58.840956 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-l9467" Apr 17 21:22:58.846240 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:58.846221 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-4b2bb\"" Apr 17 21:22:58.857296 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:58.857270 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-l9467"] Apr 17 21:22:58.939104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:58.939065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvbbg\" (UniqueName: \"kubernetes.io/projected/1cbc34c7-b672-4646-9831-6f4eda242e7b-kube-api-access-jvbbg\") pod \"authorino-operator-657f44b778-l9467\" (UID: \"1cbc34c7-b672-4646-9831-6f4eda242e7b\") " pod="kuadrant-system/authorino-operator-657f44b778-l9467" Apr 17 21:22:59.040411 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:59.040355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvbbg\" (UniqueName: \"kubernetes.io/projected/1cbc34c7-b672-4646-9831-6f4eda242e7b-kube-api-access-jvbbg\") pod \"authorino-operator-657f44b778-l9467\" (UID: \"1cbc34c7-b672-4646-9831-6f4eda242e7b\") " pod="kuadrant-system/authorino-operator-657f44b778-l9467" Apr 17 21:22:59.056443 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:59.056419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvbbg\" (UniqueName: \"kubernetes.io/projected/1cbc34c7-b672-4646-9831-6f4eda242e7b-kube-api-access-jvbbg\") pod \"authorino-operator-657f44b778-l9467\" (UID: \"1cbc34c7-b672-4646-9831-6f4eda242e7b\") " pod="kuadrant-system/authorino-operator-657f44b778-l9467" Apr 17 21:22:59.151868 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:59.151786 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-l9467" Apr 17 21:22:59.271907 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:22:59.271883 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-l9467"] Apr 17 21:22:59.274684 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:22:59.274641 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cbc34c7_b672_4646_9831_6f4eda242e7b.slice/crio-ede61582d63a308094a7aa0cccf4ad23076fc20b591f2aba026333bcdca5657b WatchSource:0}: Error finding container ede61582d63a308094a7aa0cccf4ad23076fc20b591f2aba026333bcdca5657b: Status 404 returned error can't find the container with id ede61582d63a308094a7aa0cccf4ad23076fc20b591f2aba026333bcdca5657b Apr 17 21:23:00.051622 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:00.051578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-l9467" event={"ID":"1cbc34c7-b672-4646-9831-6f4eda242e7b","Type":"ContainerStarted","Data":"ede61582d63a308094a7aa0cccf4ad23076fc20b591f2aba026333bcdca5657b"} Apr 17 21:23:02.059592 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:02.059559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-l9467" event={"ID":"1cbc34c7-b672-4646-9831-6f4eda242e7b","Type":"ContainerStarted","Data":"993f1208dccc8336c17cd9168ff2a483e2a1060e112362c01aa9047e2d59614b"} Apr 17 21:23:02.060009 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:02.059698 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-l9467" Apr 17 21:23:02.083099 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:02.083034 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-l9467" podStartSLOduration=2.229458494 podStartE2EDuration="4.083019556s" podCreationTimestamp="2026-04-17 21:22:58 +0000 UTC" firstStartedPulling="2026-04-17 21:22:59.277055344 +0000 UTC m=+590.478548196" lastFinishedPulling="2026-04-17 21:23:01.130616403 +0000 UTC m=+592.332109258" observedRunningTime="2026-04-17 21:23:02.082109654 +0000 UTC m=+593.283602528" watchObservedRunningTime="2026-04-17 21:23:02.083019556 +0000 UTC m=+593.284512430" Apr 17 21:23:05.036474 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:05.036441 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hgjcs" Apr 17 21:23:13.065779 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:13.065746 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-l9467" Apr 17 21:23:14.751590 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:14.751553 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7"] Apr 17 21:23:14.753911 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:14.753894 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7" Apr 17 21:23:14.756249 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:14.756228 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-jzsl7\"" Apr 17 21:23:14.765249 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:14.765223 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7"] Apr 17 21:23:14.862790 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:14.862753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4flsk\" (UniqueName: \"kubernetes.io/projected/8312822f-a2ce-4aec-a7c8-96ec612fb886-kube-api-access-4flsk\") pod \"limitador-operator-controller-manager-85c4996f8c-92bv7\" (UID: \"8312822f-a2ce-4aec-a7c8-96ec612fb886\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7" Apr 17 21:23:14.963739 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:14.963698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4flsk\" (UniqueName: \"kubernetes.io/projected/8312822f-a2ce-4aec-a7c8-96ec612fb886-kube-api-access-4flsk\") pod \"limitador-operator-controller-manager-85c4996f8c-92bv7\" (UID: \"8312822f-a2ce-4aec-a7c8-96ec612fb886\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7" Apr 17 21:23:14.977083 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:14.977055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4flsk\" (UniqueName: \"kubernetes.io/projected/8312822f-a2ce-4aec-a7c8-96ec612fb886-kube-api-access-4flsk\") pod \"limitador-operator-controller-manager-85c4996f8c-92bv7\" (UID: \"8312822f-a2ce-4aec-a7c8-96ec612fb886\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7" Apr 17 21:23:15.064423 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:15.064312 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7" Apr 17 21:23:15.190689 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:15.190638 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7"] Apr 17 21:23:15.195318 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:23:15.195284 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8312822f_a2ce_4aec_a7c8_96ec612fb886.slice/crio-8395b336b87cfecc9c70078ecc50fcd1fe0efc330afefd6eed03dc462b946ce3 WatchSource:0}: Error finding container 8395b336b87cfecc9c70078ecc50fcd1fe0efc330afefd6eed03dc462b946ce3: Status 404 returned error can't find the container with id 8395b336b87cfecc9c70078ecc50fcd1fe0efc330afefd6eed03dc462b946ce3 Apr 17 21:23:16.106581 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:16.106539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7" event={"ID":"8312822f-a2ce-4aec-a7c8-96ec612fb886","Type":"ContainerStarted","Data":"8395b336b87cfecc9c70078ecc50fcd1fe0efc330afefd6eed03dc462b946ce3"} Apr 17 21:23:17.111132 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:17.111097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7" event={"ID":"8312822f-a2ce-4aec-a7c8-96ec612fb886","Type":"ContainerStarted","Data":"0d2521bd03d951052d70f6f12c6878bdaf8958a0f3bbbe586adf545f1e84d4f0"} Apr 17 21:23:17.111531 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:17.111205 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7" Apr 17 21:23:17.127334 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:17.127268 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7" podStartSLOduration=1.589969472 podStartE2EDuration="3.127251334s" podCreationTimestamp="2026-04-17 21:23:14 +0000 UTC" firstStartedPulling="2026-04-17 21:23:15.197270122 +0000 UTC m=+606.398762974" lastFinishedPulling="2026-04-17 21:23:16.734551985 +0000 UTC m=+607.936044836" observedRunningTime="2026-04-17 21:23:17.125973449 +0000 UTC m=+608.327466326" watchObservedRunningTime="2026-04-17 21:23:17.127251334 +0000 UTC m=+608.328744208" Apr 17 21:23:28.116433 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:23:28.116400 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92bv7" Apr 17 21:24:34.836935 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:34.836899 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q6rc6"] Apr 17 21:24:34.840140 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:34.840119 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" Apr 17 21:24:34.842742 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:34.842711 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-mv65r\"" Apr 17 21:24:34.848804 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:34.848778 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q6rc6"] Apr 17 21:24:34.905196 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:34.905156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsxvb\" (UniqueName: \"kubernetes.io/projected/bc4cf9bf-3355-4815-9686-e3f893f06d49-kube-api-access-jsxvb\") pod \"maas-controller-6d4c8f55f9-q6rc6\" (UID: \"bc4cf9bf-3355-4815-9686-e3f893f06d49\") " pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" Apr 17 21:24:35.006524 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:35.006489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsxvb\" (UniqueName: \"kubernetes.io/projected/bc4cf9bf-3355-4815-9686-e3f893f06d49-kube-api-access-jsxvb\") pod \"maas-controller-6d4c8f55f9-q6rc6\" (UID: \"bc4cf9bf-3355-4815-9686-e3f893f06d49\") " pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" Apr 17 21:24:35.014486 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:35.014456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsxvb\" (UniqueName: \"kubernetes.io/projected/bc4cf9bf-3355-4815-9686-e3f893f06d49-kube-api-access-jsxvb\") pod \"maas-controller-6d4c8f55f9-q6rc6\" (UID: \"bc4cf9bf-3355-4815-9686-e3f893f06d49\") " pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" Apr 17 21:24:35.096475 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:35.096379 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q6rc6"] Apr 17 21:24:35.096629 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:35.096618 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" Apr 17 21:24:35.239091 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:35.239065 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q6rc6"] Apr 17 21:24:35.241364 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:24:35.241338 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4cf9bf_3355_4815_9686_e3f893f06d49.slice/crio-c837c06f81dc7b570ed39496b31477a8fe72a9e372438320e0fea5e7ceed938c WatchSource:0}: Error finding container c837c06f81dc7b570ed39496b31477a8fe72a9e372438320e0fea5e7ceed938c: Status 404 returned error can't find the container with id c837c06f81dc7b570ed39496b31477a8fe72a9e372438320e0fea5e7ceed938c Apr 17 21:24:35.359767 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:35.359681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" event={"ID":"bc4cf9bf-3355-4815-9686-e3f893f06d49","Type":"ContainerStarted","Data":"c837c06f81dc7b570ed39496b31477a8fe72a9e372438320e0fea5e7ceed938c"} Apr 17 21:24:38.371384 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:38.371351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" event={"ID":"bc4cf9bf-3355-4815-9686-e3f893f06d49","Type":"ContainerStarted","Data":"5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c"} Apr 17 21:24:38.371807 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:38.371466 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" podUID="bc4cf9bf-3355-4815-9686-e3f893f06d49" containerName="manager" containerID="cri-o://5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c" gracePeriod=10 Apr 17 21:24:38.371807 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:38.371502 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" Apr 17 21:24:38.386431 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:38.386385 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" podStartSLOduration=2.232641412 podStartE2EDuration="4.38637247s" podCreationTimestamp="2026-04-17 21:24:34 +0000 UTC" firstStartedPulling="2026-04-17 21:24:35.242461795 +0000 UTC m=+686.443954648" lastFinishedPulling="2026-04-17 21:24:37.396192853 +0000 UTC m=+688.597685706" observedRunningTime="2026-04-17 21:24:38.385331614 +0000 UTC m=+689.586824491" watchObservedRunningTime="2026-04-17 21:24:38.38637247 +0000 UTC m=+689.587865343" Apr 17 21:24:38.597991 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:38.597967 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" Apr 17 21:24:38.737936 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:38.737841 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsxvb\" (UniqueName: \"kubernetes.io/projected/bc4cf9bf-3355-4815-9686-e3f893f06d49-kube-api-access-jsxvb\") pod \"bc4cf9bf-3355-4815-9686-e3f893f06d49\" (UID: \"bc4cf9bf-3355-4815-9686-e3f893f06d49\") " Apr 17 21:24:38.740074 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:38.740036 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4cf9bf-3355-4815-9686-e3f893f06d49-kube-api-access-jsxvb" (OuterVolumeSpecName: "kube-api-access-jsxvb") pod "bc4cf9bf-3355-4815-9686-e3f893f06d49" (UID: "bc4cf9bf-3355-4815-9686-e3f893f06d49"). InnerVolumeSpecName "kube-api-access-jsxvb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:24:38.838534 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:38.838498 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jsxvb\" (UniqueName: \"kubernetes.io/projected/bc4cf9bf-3355-4815-9686-e3f893f06d49-kube-api-access-jsxvb\") on node \"ip-10-0-138-36.ec2.internal\" DevicePath \"\"" Apr 17 21:24:39.375990 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:39.375954 2576 generic.go:358] "Generic (PLEG): container finished" podID="bc4cf9bf-3355-4815-9686-e3f893f06d49" containerID="5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c" exitCode=0 Apr 17 21:24:39.376410 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:39.376011 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" Apr 17 21:24:39.376410 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:39.376018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" event={"ID":"bc4cf9bf-3355-4815-9686-e3f893f06d49","Type":"ContainerDied","Data":"5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c"} Apr 17 21:24:39.376410 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:39.376052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-q6rc6" event={"ID":"bc4cf9bf-3355-4815-9686-e3f893f06d49","Type":"ContainerDied","Data":"c837c06f81dc7b570ed39496b31477a8fe72a9e372438320e0fea5e7ceed938c"} Apr 17 21:24:39.376410 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:39.376068 2576 scope.go:117] "RemoveContainer" containerID="5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c" Apr 17 21:24:39.387606 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:39.387579 2576 scope.go:117] "RemoveContainer" containerID="5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c" Apr 17 21:24:39.388886 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:24:39.388858 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c\": container with ID starting with 5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c not found: ID does not exist" containerID="5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c" Apr 17 21:24:39.389029 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:39.388902 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c"} 
err="failed to get container status \"5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c\": rpc error: code = NotFound desc = could not find container \"5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c\": container with ID starting with 5c849f0a2794ea8bc591517ee97faee7e4d533b90415e61829f9bac5e3b3030c not found: ID does not exist" Apr 17 21:24:39.404249 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:39.404220 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q6rc6"] Apr 17 21:24:39.409407 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:39.409377 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q6rc6"] Apr 17 21:24:41.383493 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:24:41.383460 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4cf9bf-3355-4815-9686-e3f893f06d49" path="/var/lib/kubelet/pods/bc4cf9bf-3355-4815-9686-e3f893f06d49/volumes" Apr 17 21:25:17.855634 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.855589 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh"] Apr 17 21:25:17.856073 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.856056 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc4cf9bf-3355-4815-9686-e3f893f06d49" containerName="manager" Apr 17 21:25:17.856116 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.856078 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4cf9bf-3355-4815-9686-e3f893f06d49" containerName="manager" Apr 17 21:25:17.856168 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.856156 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc4cf9bf-3355-4815-9686-e3f893f06d49" containerName="manager" Apr 17 21:25:17.862877 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.862852 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:17.865421 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.865390 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 21:25:17.865585 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.865448 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 21:25:17.866250 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.866228 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-vmdkh\"" Apr 17 21:25:17.866379 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.866278 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 21:25:17.866447 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.866309 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh"] Apr 17 21:25:17.945867 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.945829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:17.945867 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.945867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:17.946072 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.945901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clzx6\" (UniqueName: \"kubernetes.io/projected/817d58b6-39de-4984-8798-1e60af852ab7-kube-api-access-clzx6\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:17.946072 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.945976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/817d58b6-39de-4984-8798-1e60af852ab7-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:17.946072 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:17.946016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:17.946215 ip-10-0-138-36 
kubenswrapper[2576]: I0417 21:25:17.946106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.046969 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.046921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.047177 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.046995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.047177 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.047023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.047177 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.047060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clzx6\" (UniqueName: \"kubernetes.io/projected/817d58b6-39de-4984-8798-1e60af852ab7-kube-api-access-clzx6\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.047339 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.047203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/817d58b6-39de-4984-8798-1e60af852ab7-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.047339 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.047261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.047428 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.047401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: 
\"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.047469 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.047405 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.047546 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.047529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.049365 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.049343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/817d58b6-39de-4984-8798-1e60af852ab7-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.049546 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.049527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/817d58b6-39de-4984-8798-1e60af852ab7-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.054442 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.054410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clzx6\" (UniqueName: \"kubernetes.io/projected/817d58b6-39de-4984-8798-1e60af852ab7-kube-api-access-clzx6\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-kg8rh\" (UID: \"817d58b6-39de-4984-8798-1e60af852ab7\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.173560 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.173469 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:18.305742 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.305715 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh"] Apr 17 21:25:18.308291 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:25:18.308263 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod817d58b6_39de_4984_8798_1e60af852ab7.slice/crio-c833b49bab320b8055ddec45d4fe480fb50f36571449e973dc7e5b3701aa9607 WatchSource:0}: Error finding container c833b49bab320b8055ddec45d4fe480fb50f36571449e973dc7e5b3701aa9607: Status 404 returned error can't find the container with id c833b49bab320b8055ddec45d4fe480fb50f36571449e973dc7e5b3701aa9607 Apr 17 21:25:18.499492 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:18.499409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" event={"ID":"817d58b6-39de-4984-8798-1e60af852ab7","Type":"ContainerStarted","Data":"c833b49bab320b8055ddec45d4fe480fb50f36571449e973dc7e5b3701aa9607"} Apr 17 21:25:23.112614 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.112563 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7fcbcc7f7-smt76"] Apr 17 21:25:23.164338 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.164300 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7fcbcc7f7-smt76"] Apr 17 21:25:23.164527 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.164448 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7fcbcc7f7-smt76" Apr 17 21:25:23.167027 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.166873 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 21:25:23.167866 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.167839 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 21:25:23.167984 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.167861 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-c56s7\"" Apr 17 21:25:23.294997 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.294962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a6a18dac-c45f-4f71-81b2-a7eaf50652b7-maas-api-tls\") pod \"maas-api-7fcbcc7f7-smt76\" (UID: \"a6a18dac-c45f-4f71-81b2-a7eaf50652b7\") " pod="opendatahub/maas-api-7fcbcc7f7-smt76" Apr 17 21:25:23.294997 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.295001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4c5r\" (UniqueName: \"kubernetes.io/projected/a6a18dac-c45f-4f71-81b2-a7eaf50652b7-kube-api-access-f4c5r\") pod \"maas-api-7fcbcc7f7-smt76\" (UID: \"a6a18dac-c45f-4f71-81b2-a7eaf50652b7\") " pod="opendatahub/maas-api-7fcbcc7f7-smt76" Apr 17 21:25:23.396385 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.396310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a6a18dac-c45f-4f71-81b2-a7eaf50652b7-maas-api-tls\") pod \"maas-api-7fcbcc7f7-smt76\" (UID: 
\"a6a18dac-c45f-4f71-81b2-a7eaf50652b7\") " pod="opendatahub/maas-api-7fcbcc7f7-smt76" Apr 17 21:25:23.396385 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.396347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4c5r\" (UniqueName: \"kubernetes.io/projected/a6a18dac-c45f-4f71-81b2-a7eaf50652b7-kube-api-access-f4c5r\") pod \"maas-api-7fcbcc7f7-smt76\" (UID: \"a6a18dac-c45f-4f71-81b2-a7eaf50652b7\") " pod="opendatahub/maas-api-7fcbcc7f7-smt76" Apr 17 21:25:23.398971 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.398943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a6a18dac-c45f-4f71-81b2-a7eaf50652b7-maas-api-tls\") pod \"maas-api-7fcbcc7f7-smt76\" (UID: \"a6a18dac-c45f-4f71-81b2-a7eaf50652b7\") " pod="opendatahub/maas-api-7fcbcc7f7-smt76" Apr 17 21:25:23.403932 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.403904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4c5r\" (UniqueName: \"kubernetes.io/projected/a6a18dac-c45f-4f71-81b2-a7eaf50652b7-kube-api-access-f4c5r\") pod \"maas-api-7fcbcc7f7-smt76\" (UID: \"a6a18dac-c45f-4f71-81b2-a7eaf50652b7\") " pod="opendatahub/maas-api-7fcbcc7f7-smt76" Apr 17 21:25:23.477827 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.477804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7fcbcc7f7-smt76" Apr 17 21:25:23.531863 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.531794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" event={"ID":"817d58b6-39de-4984-8798-1e60af852ab7","Type":"ContainerStarted","Data":"406d148d648442ee6496b468681af89e4c8c11e704fb5b2585f5ef1bb0d2beb2"} Apr 17 21:25:23.613453 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:23.613422 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7fcbcc7f7-smt76"] Apr 17 21:25:23.622953 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:25:23.622920 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6a18dac_c45f_4f71_81b2_a7eaf50652b7.slice/crio-48caff1e81c1493e4a3d1a918930a3fdbd82ed58b19b3ae0d4860f12c7494fa7 WatchSource:0}: Error finding container 48caff1e81c1493e4a3d1a918930a3fdbd82ed58b19b3ae0d4860f12c7494fa7: Status 404 returned error can't find the container with id 48caff1e81c1493e4a3d1a918930a3fdbd82ed58b19b3ae0d4860f12c7494fa7 Apr 17 21:25:24.537403 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:24.537357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7fcbcc7f7-smt76" event={"ID":"a6a18dac-c45f-4f71-81b2-a7eaf50652b7","Type":"ContainerStarted","Data":"48caff1e81c1493e4a3d1a918930a3fdbd82ed58b19b3ae0d4860f12c7494fa7"} Apr 17 21:25:25.542746 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:25.542709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7fcbcc7f7-smt76" event={"ID":"a6a18dac-c45f-4f71-81b2-a7eaf50652b7","Type":"ContainerStarted","Data":"bb82b83080afc53f28ddd33a47d89b51e61d1d3e7bd5b92c654b3881db14d788"} Apr 17 21:25:25.543134 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:25.542823 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7fcbcc7f7-smt76" Apr 17 21:25:25.558873 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:25.558826 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7fcbcc7f7-smt76" podStartSLOduration=0.848974875 podStartE2EDuration="2.558812151s" podCreationTimestamp="2026-04-17 21:25:23 +0000 UTC" firstStartedPulling="2026-04-17 21:25:23.62443124 +0000 UTC m=+734.825924092" lastFinishedPulling="2026-04-17 21:25:25.334268512 +0000 UTC m=+736.535761368" observedRunningTime="2026-04-17 21:25:25.556843594 +0000 UTC m=+736.758336468" watchObservedRunningTime="2026-04-17 21:25:25.558812151 +0000 UTC m=+736.760305025" Apr 17 21:25:31.552198 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:31.552164 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7fcbcc7f7-smt76" Apr 17 21:25:32.566685 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:32.566630 2576 generic.go:358] "Generic (PLEG): container finished" podID="817d58b6-39de-4984-8798-1e60af852ab7" containerID="406d148d648442ee6496b468681af89e4c8c11e704fb5b2585f5ef1bb0d2beb2" exitCode=0 Apr 17 21:25:32.567063 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:32.566709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" event={"ID":"817d58b6-39de-4984-8798-1e60af852ab7","Type":"ContainerDied","Data":"406d148d648442ee6496b468681af89e4c8c11e704fb5b2585f5ef1bb0d2beb2"} Apr 17 21:25:34.575491 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:34.575459 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/0.log" Apr 17 21:25:34.575952 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:34.575772 2576 generic.go:358] "Generic (PLEG): container finished" podID="817d58b6-39de-4984-8798-1e60af852ab7" containerID="52d522c736cb9ed0871e77c2bcb5f511930476e1c6d79854b0d689f571ce4260" exitCode=2 Apr 17 21:25:34.575952 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:34.575858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" event={"ID":"817d58b6-39de-4984-8798-1e60af852ab7","Type":"ContainerDied","Data":"52d522c736cb9ed0871e77c2bcb5f511930476e1c6d79854b0d689f571ce4260"} Apr 17 21:25:34.576258 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:34.576241 2576 scope.go:117] "RemoveContainer" containerID="52d522c736cb9ed0871e77c2bcb5f511930476e1c6d79854b0d689f571ce4260" Apr 17 21:25:35.580795 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:35.580764 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/1.log" Apr 17 21:25:35.581207 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:35.581133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/0.log" Apr 17 21:25:35.581449 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:35.581428 2576 generic.go:358] "Generic (PLEG): container finished" podID="817d58b6-39de-4984-8798-1e60af852ab7" containerID="910ee8e7803d7cbe880fd038da8a3acd08deeb2db5af105ef0be7c42100a1efa" exitCode=2 Apr 17 21:25:35.581503 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:35.581489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" 
event={"ID":"817d58b6-39de-4984-8798-1e60af852ab7","Type":"ContainerDied","Data":"910ee8e7803d7cbe880fd038da8a3acd08deeb2db5af105ef0be7c42100a1efa"} Apr 17 21:25:35.581539 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:35.581527 2576 scope.go:117] "RemoveContainer" containerID="52d522c736cb9ed0871e77c2bcb5f511930476e1c6d79854b0d689f571ce4260" Apr 17 21:25:35.582003 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:35.581981 2576 scope.go:117] "RemoveContainer" containerID="910ee8e7803d7cbe880fd038da8a3acd08deeb2db5af105ef0be7c42100a1efa" Apr 17 21:25:35.582226 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:25:35.582208 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:25:36.587232 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:36.587203 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/1.log" Apr 17 21:25:38.173849 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:38.173801 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:38.173849 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:38.173850 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:38.174280 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:38.174220 2576 scope.go:117] "RemoveContainer" containerID="910ee8e7803d7cbe880fd038da8a3acd08deeb2db5af105ef0be7c42100a1efa" Apr 17 21:25:38.174404 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:25:38.174386 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:25:39.655104 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.655067 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4"] Apr 17 21:25:39.699826 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.699782 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4"] Apr 17 21:25:39.700044 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.699913 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.702178 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.702154 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 17 21:25:39.843304 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.843266 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.843304 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.843303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.843517 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.843383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bffdbbd-62e6-4462-977a-ddae82e5abc8-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.843517 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.843427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5kfc\" (UniqueName: \"kubernetes.io/projected/7bffdbbd-62e6-4462-977a-ddae82e5abc8-kube-api-access-r5kfc\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.843517 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.843488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.843517 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.843510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.944025 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.943940 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.944025 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.943983 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.944246 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.944044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bffdbbd-62e6-4462-977a-ddae82e5abc8-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.944246 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.944068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5kfc\" (UniqueName: \"kubernetes.io/projected/7bffdbbd-62e6-4462-977a-ddae82e5abc8-kube-api-access-r5kfc\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.944246 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.944116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.944246 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.944142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.944499 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.944479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-model-cache\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.944553 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.944540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-home\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.944619 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.944599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.946337 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.946316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bffdbbd-62e6-4462-977a-ddae82e5abc8-dshm\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.946459 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.946441 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bffdbbd-62e6-4462-977a-ddae82e5abc8-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:39.951986 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:39.951963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5kfc\" (UniqueName: \"kubernetes.io/projected/7bffdbbd-62e6-4462-977a-ddae82e5abc8-kube-api-access-r5kfc\") pod \"e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4\" (UID: \"7bffdbbd-62e6-4462-977a-ddae82e5abc8\") " pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:40.010453 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:40.010410 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:40.142224 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:40.142184 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4"] Apr 17 21:25:40.145112 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:25:40.145082 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bffdbbd_62e6_4462_977a_ddae82e5abc8.slice/crio-80ee72379c242842e7765d2e7bb3d1db3d4d19604d523baedea75a82b505d5ea WatchSource:0}: Error finding container 80ee72379c242842e7765d2e7bb3d1db3d4d19604d523baedea75a82b505d5ea: Status 404 returned error can't find the container with id 80ee72379c242842e7765d2e7bb3d1db3d4d19604d523baedea75a82b505d5ea Apr 17 21:25:40.147152 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:40.147134 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:25:40.602822 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:40.602786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" event={"ID":"7bffdbbd-62e6-4462-977a-ddae82e5abc8","Type":"ContainerStarted","Data":"323abbb635f38262a0b5300a7af956c342e2c8712ace13fb5ac79e1ca776a7be"} Apr 17 21:25:40.602822 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:40.602825 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" event={"ID":"7bffdbbd-62e6-4462-977a-ddae82e5abc8","Type":"ContainerStarted","Data":"80ee72379c242842e7765d2e7bb3d1db3d4d19604d523baedea75a82b505d5ea"} Apr 17 21:25:46.634056 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:46.634019 2576 generic.go:358] "Generic (PLEG): container finished" podID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" containerID="323abbb635f38262a0b5300a7af956c342e2c8712ace13fb5ac79e1ca776a7be" exitCode=0 Apr 17 21:25:46.634489 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:46.634070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" 
event={"ID":"7bffdbbd-62e6-4462-977a-ddae82e5abc8","Type":"ContainerDied","Data":"323abbb635f38262a0b5300a7af956c342e2c8712ace13fb5ac79e1ca776a7be"} Apr 17 21:25:47.639032 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:47.639007 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/0.log" Apr 17 21:25:47.639431 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:47.639307 2576 generic.go:358] "Generic (PLEG): container finished" podID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" containerID="c9c869284d06c4e75e9ae871178cb5a22f761c2913a5d9a3e6d9906efff6e01b" exitCode=2 Apr 17 21:25:47.639431 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:47.639344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" event={"ID":"7bffdbbd-62e6-4462-977a-ddae82e5abc8","Type":"ContainerDied","Data":"c9c869284d06c4e75e9ae871178cb5a22f761c2913a5d9a3e6d9906efff6e01b"} Apr 17 21:25:47.639724 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:47.639709 2576 scope.go:117] "RemoveContainer" containerID="c9c869284d06c4e75e9ae871178cb5a22f761c2913a5d9a3e6d9906efff6e01b" Apr 17 21:25:48.644996 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:48.644968 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/1.log" Apr 17 21:25:48.645395 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:48.645375 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/0.log" Apr 17 21:25:48.645748 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:48.645714 2576 generic.go:358] "Generic (PLEG): container finished" podID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" containerID="347c852bd499b290f70f184876bdbfd4f375d7df3c7fc0acd14000947a16d856" exitCode=2 Apr 17 21:25:48.645852 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:48.645774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" event={"ID":"7bffdbbd-62e6-4462-977a-ddae82e5abc8","Type":"ContainerDied","Data":"347c852bd499b290f70f184876bdbfd4f375d7df3c7fc0acd14000947a16d856"} Apr 17 21:25:48.645852 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:48.645831 2576 scope.go:117] "RemoveContainer" containerID="c9c869284d06c4e75e9ae871178cb5a22f761c2913a5d9a3e6d9906efff6e01b" Apr 17 21:25:48.646288 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:48.646271 2576 scope.go:117] "RemoveContainer" containerID="347c852bd499b290f70f184876bdbfd4f375d7df3c7fc0acd14000947a16d856" Apr 17 21:25:48.646496 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:25:48.646479 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:25:49.650488 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:49.650463 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/1.log" Apr 17 21:25:50.011461 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:50.011381 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:50.011461 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:50.011422 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:25:50.011926 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:50.011908 2576 scope.go:117] "RemoveContainer" containerID="347c852bd499b290f70f184876bdbfd4f375d7df3c7fc0acd14000947a16d856" Apr 17 21:25:50.012138 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:25:50.012118 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:25:51.380084 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:51.380049 2576 scope.go:117] "RemoveContainer" containerID="910ee8e7803d7cbe880fd038da8a3acd08deeb2db5af105ef0be7c42100a1efa" Apr 17 21:25:52.663462 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:52.663436 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/2.log" Apr 17 21:25:52.663885 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:52.663823 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/1.log" Apr 17 21:25:52.664147 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:52.664125 2576 generic.go:358] "Generic (PLEG): container finished" podID="817d58b6-39de-4984-8798-1e60af852ab7" containerID="4aa4e1eec30170faa94e12f7903eafbdbf2021815b585e017d9a0b75460a8e71" exitCode=2 Apr 17 21:25:52.664222 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:52.664161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" event={"ID":"817d58b6-39de-4984-8798-1e60af852ab7","Type":"ContainerDied","Data":"4aa4e1eec30170faa94e12f7903eafbdbf2021815b585e017d9a0b75460a8e71"} Apr 17 21:25:52.664222 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:52.664197 2576 scope.go:117] "RemoveContainer" containerID="910ee8e7803d7cbe880fd038da8a3acd08deeb2db5af105ef0be7c42100a1efa" Apr 17 21:25:52.664605 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:52.664588 2576 scope.go:117] "RemoveContainer" containerID="4aa4e1eec30170faa94e12f7903eafbdbf2021815b585e017d9a0b75460a8e71" Apr 17 21:25:52.664839 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:25:52.664818 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:25:53.668583 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:53.668557 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/2.log" Apr 17 21:25:58.174133 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:58.174092 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:58.174133 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:58.174138 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:25:58.174555 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:25:58.174544 2576 scope.go:117] "RemoveContainer" containerID="4aa4e1eec30170faa94e12f7903eafbdbf2021815b585e017d9a0b75460a8e71" Apr 17 21:25:58.174766 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:25:58.174746 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:26:00.380307 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:00.380274 2576 scope.go:117] "RemoveContainer" containerID="347c852bd499b290f70f184876bdbfd4f375d7df3c7fc0acd14000947a16d856" Apr 17 21:26:00.692582 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:00.692553 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/2.log" Apr 17 21:26:00.692945 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:00.692927 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/1.log" Apr 17 21:26:00.693278 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:00.693256 2576 generic.go:358] "Generic (PLEG): container finished" podID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" containerID="a4b1d5629c845507de105da724b7b25c9f7387462f1725b246df364eb1446e1d" exitCode=2 Apr 17 21:26:00.693344 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:00.693327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" event={"ID":"7bffdbbd-62e6-4462-977a-ddae82e5abc8","Type":"ContainerDied","Data":"a4b1d5629c845507de105da724b7b25c9f7387462f1725b246df364eb1446e1d"} Apr 17 21:26:00.693381 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:00.693366 2576 scope.go:117] "RemoveContainer" containerID="347c852bd499b290f70f184876bdbfd4f375d7df3c7fc0acd14000947a16d856" Apr 17 21:26:00.693758 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:00.693741 2576 scope.go:117] "RemoveContainer" containerID="a4b1d5629c845507de105da724b7b25c9f7387462f1725b246df364eb1446e1d" Apr 17 21:26:00.693937 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:00.693916 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:26:01.697614 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:01.697585 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/2.log" Apr 17 21:26:10.010996 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:10.010956 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:26:10.010996 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:10.010996 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:26:10.011568 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:10.011517 2576 scope.go:117] "RemoveContainer" containerID="a4b1d5629c845507de105da724b7b25c9f7387462f1725b246df364eb1446e1d" Apr 17 21:26:10.011791 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:10.011769 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:26:12.379812 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:12.379781 2576 scope.go:117] "RemoveContainer" containerID="4aa4e1eec30170faa94e12f7903eafbdbf2021815b585e017d9a0b75460a8e71" Apr 17 21:26:12.734785 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:12.734758 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/3.log" Apr 17 21:26:12.735135 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:12.735118 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/2.log" Apr 17 21:26:12.735423 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:12.735401 2576 generic.go:358] "Generic (PLEG): container finished" podID="817d58b6-39de-4984-8798-1e60af852ab7" containerID="317e5f168d6bdbdfa925e480aacc4e85276f7ff5433d69b85e84009ed469292c" exitCode=2 Apr 17 21:26:12.735515 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:12.735437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" event={"ID":"817d58b6-39de-4984-8798-1e60af852ab7","Type":"ContainerDied","Data":"317e5f168d6bdbdfa925e480aacc4e85276f7ff5433d69b85e84009ed469292c"} Apr 17 21:26:12.735515 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:12.735475 2576 scope.go:117] "RemoveContainer" containerID="4aa4e1eec30170faa94e12f7903eafbdbf2021815b585e017d9a0b75460a8e71" Apr 17 21:26:12.735886 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:12.735857 2576 scope.go:117] "RemoveContainer" containerID="317e5f168d6bdbdfa925e480aacc4e85276f7ff5433d69b85e84009ed469292c" Apr 17 21:26:12.736058 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:12.736037 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:26:13.740464 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:13.740438 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/3.log" Apr 17 21:26:18.174017 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:18.173977 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:26:18.174017 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:18.174014 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:26:18.174517 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:18.174409 2576 scope.go:117] "RemoveContainer" containerID="317e5f168d6bdbdfa925e480aacc4e85276f7ff5433d69b85e84009ed469292c" Apr 17 21:26:18.174627 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:18.174609 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:26:22.379583 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:22.379548 2576 scope.go:117] "RemoveContainer" containerID="a4b1d5629c845507de105da724b7b25c9f7387462f1725b246df364eb1446e1d" Apr 17 21:26:22.771187 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:22.771162 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/3.log" Apr 17 21:26:22.771498 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:22.771482 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/2.log" Apr 17 21:26:22.771803 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:22.771780 2576 generic.go:358] "Generic (PLEG): container finished" podID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" containerID="574a60f448ffa5f02804b4cc7e20460909c7eabb0e4fbd628fe60e3eca6598e2" exitCode=2 Apr 17 21:26:22.771910 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:22.771856 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" event={"ID":"7bffdbbd-62e6-4462-977a-ddae82e5abc8","Type":"ContainerDied","Data":"574a60f448ffa5f02804b4cc7e20460909c7eabb0e4fbd628fe60e3eca6598e2"} Apr 17 21:26:22.771910 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:22.771894 2576 scope.go:117] "RemoveContainer" containerID="a4b1d5629c845507de105da724b7b25c9f7387462f1725b246df364eb1446e1d" Apr 17 21:26:22.772360 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:22.772342 2576 scope.go:117] "RemoveContainer" containerID="574a60f448ffa5f02804b4cc7e20460909c7eabb0e4fbd628fe60e3eca6598e2" Apr 17 21:26:22.772620 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:22.772600 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:26:23.776348 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:23.776323 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/3.log" Apr 17 21:26:30.011171 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:30.011065 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:26:30.011171 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:30.011108 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:26:30.011756 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:30.011703 2576 scope.go:117] "RemoveContainer" containerID="574a60f448ffa5f02804b4cc7e20460909c7eabb0e4fbd628fe60e3eca6598e2" Apr 17 21:26:30.011952 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:30.011930 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:26:32.379980 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:32.379933 2576 scope.go:117] "RemoveContainer" containerID="317e5f168d6bdbdfa925e480aacc4e85276f7ff5433d69b85e84009ed469292c" Apr 17 21:26:32.380400 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:32.380129 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:26:42.380166 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:42.380137 2576 scope.go:117] "RemoveContainer" containerID="574a60f448ffa5f02804b4cc7e20460909c7eabb0e4fbd628fe60e3eca6598e2" Apr 17 21:26:42.380576 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:42.380319 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:26:47.379928 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:47.379896 2576 scope.go:117] "RemoveContainer" containerID="317e5f168d6bdbdfa925e480aacc4e85276f7ff5433d69b85e84009ed469292c" Apr 17 21:26:47.380308 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:47.380093 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:26:56.379967 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:56.379932 2576 scope.go:117] "RemoveContainer" containerID="574a60f448ffa5f02804b4cc7e20460909c7eabb0e4fbd628fe60e3eca6598e2" Apr 17 21:26:56.380459 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:56.380096 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" 
pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:26:58.380201 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:58.380165 2576 scope.go:117] "RemoveContainer" containerID="317e5f168d6bdbdfa925e480aacc4e85276f7ff5433d69b85e84009ed469292c" Apr 17 21:26:58.891488 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:58.891458 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/4.log" Apr 17 21:26:58.891842 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:58.891826 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/3.log" Apr 17 21:26:58.892119 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:58.892100 2576 generic.go:358] "Generic (PLEG): container finished" podID="817d58b6-39de-4984-8798-1e60af852ab7" containerID="6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957" exitCode=2 Apr 17 21:26:58.892192 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:58.892175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" event={"ID":"817d58b6-39de-4984-8798-1e60af852ab7","Type":"ContainerDied","Data":"6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957"} Apr 17 21:26:58.892230 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:58.892214 2576 scope.go:117] "RemoveContainer" containerID="317e5f168d6bdbdfa925e480aacc4e85276f7ff5433d69b85e84009ed469292c" Apr 17 21:26:58.892590 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:58.892571 2576 scope.go:117] "RemoveContainer" containerID="6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957" Apr 17 21:26:58.892830 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:26:58.892805 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:26:59.896991 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:26:59.896964 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/4.log" Apr 17 21:27:08.173757 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:08.173721 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:27:08.173757 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:08.173756 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:27:08.174319 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:08.174154 2576 scope.go:117] "RemoveContainer" containerID="6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957" Apr 17 21:27:08.174367 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:27:08.174336 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main 
pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:27:10.380413 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:10.380378 2576 scope.go:117] "RemoveContainer" containerID="574a60f448ffa5f02804b4cc7e20460909c7eabb0e4fbd628fe60e3eca6598e2" Apr 17 21:27:10.937916 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:10.937840 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/4.log" Apr 17 21:27:10.938221 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:10.938205 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/3.log" Apr 17 21:27:10.938501 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:10.938480 2576 generic.go:358] "Generic (PLEG): container finished" podID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" exitCode=2 Apr 17 21:27:10.938580 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:10.938557 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" event={"ID":"7bffdbbd-62e6-4462-977a-ddae82e5abc8","Type":"ContainerDied","Data":"ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d"} Apr 17 21:27:10.938617 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:10.938604 2576 scope.go:117] "RemoveContainer" containerID="574a60f448ffa5f02804b4cc7e20460909c7eabb0e4fbd628fe60e3eca6598e2" Apr 17 21:27:10.939037 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:10.939020 2576 scope.go:117] "RemoveContainer" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" Apr 17 21:27:10.939256 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:27:10.939239 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:27:11.943202 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:11.943176 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/4.log" Apr 17 21:27:20.011153 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:20.011102 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:27:20.011153 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:20.011153 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:27:20.011606 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:20.011571 2576 scope.go:117] "RemoveContainer" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" Apr 17 21:27:20.011817 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:27:20.011798 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main 
pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:27:21.384738 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:21.384709 2576 scope.go:117] "RemoveContainer" containerID="6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957" Apr 17 21:27:21.385109 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:27:21.384868 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:27:31.380336 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:31.380305 2576 scope.go:117] "RemoveContainer" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" Apr 17 21:27:31.380740 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:27:31.380476 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:27:35.380663 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:35.380630 2576 scope.go:117] "RemoveContainer" containerID="6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957" Apr 17 21:27:35.381054 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:27:35.380845 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:27:43.380157 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:43.380124 2576 scope.go:117] "RemoveContainer" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" Apr 17 21:27:43.380614 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:27:43.380293 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:27:48.380451 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:48.380417 2576 scope.go:117] "RemoveContainer" containerID="6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957" Apr 17 21:27:48.380872 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:27:48.380685 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" 
Apr 17 21:27:57.380487 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:27:57.380455 2576 scope.go:117] "RemoveContainer" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" Apr 17 21:27:57.380920 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:27:57.380645 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:28:03.380561 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:03.380475 2576 scope.go:117] "RemoveContainer" containerID="6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957" Apr 17 21:28:03.381028 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:28:03.380683 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:28:08.379963 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:08.379922 2576 scope.go:117] "RemoveContainer" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" Apr 17 21:28:08.380458 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:28:08.380151 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:28:09.292238 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:09.292206 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/4.log" Apr 17 21:28:09.293172 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:09.293148 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/4.log" Apr 17 21:28:09.293280 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:09.293154 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/4.log" Apr 17 21:28:09.294025 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:09.294006 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/4.log" Apr 17 21:28:17.380040 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:17.380005 2576 scope.go:117] "RemoveContainer" containerID="6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957" Apr 17 21:28:17.380531 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:28:17.380213 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main 
pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:28:19.385859 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:19.385828 2576 scope.go:117] "RemoveContainer" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" Apr 17 21:28:19.386239 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:28:19.386010 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:28:30.379857 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:30.379820 2576 scope.go:117] "RemoveContainer" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" Apr 17 21:28:30.380288 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:28:30.380017 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:28:31.380789 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:31.380750 2576 scope.go:117] "RemoveContainer" containerID="6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957" Apr 17 21:28:32.204320 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:32.204293 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/5.log" Apr 17 21:28:32.204744 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:32.204729 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/4.log" Apr 17 21:28:32.205049 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:32.205027 2576 generic.go:358] "Generic (PLEG): container finished" podID="817d58b6-39de-4984-8798-1e60af852ab7" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" exitCode=2 Apr 17 21:28:32.205128 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:32.205098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" event={"ID":"817d58b6-39de-4984-8798-1e60af852ab7","Type":"ContainerDied","Data":"6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b"} Apr 17 21:28:32.205171 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:32.205144 2576 scope.go:117] "RemoveContainer" containerID="6a4f6169a9fa7d364b7b56d03cf71074cadac1bf52fa5c25a9d3eec65b01a957" Apr 17 21:28:32.205538 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:32.205522 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:28:32.205744 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:28:32.205727 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main 
pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:28:33.209413 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:33.209386 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/5.log" Apr 17 21:28:38.173737 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:38.173697 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:28:38.173737 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:38.173735 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" Apr 17 21:28:38.174181 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:38.174140 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:28:38.174338 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:28:38.174321 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:28:45.380611 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:45.380573 2576 scope.go:117] "RemoveContainer" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" Apr 17 21:28:46.257106 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:46.257075 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/5.log" Apr 17 21:28:46.257461 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:46.257445 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/4.log" Apr 17 21:28:46.257783 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:46.257760 2576 generic.go:358] "Generic (PLEG): container finished" podID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" exitCode=2 Apr 17 21:28:46.257885 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:46.257829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" event={"ID":"7bffdbbd-62e6-4462-977a-ddae82e5abc8","Type":"ContainerDied","Data":"da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e"} Apr 17 21:28:46.257885 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:46.257866 2576 scope.go:117] "RemoveContainer" containerID="ef37a2a76e95740b540b522fd0f0f17533065f13cea21a56c9c5b54a4130af6d" Apr 17 21:28:46.258272 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:46.258256 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:28:46.258475 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:28:46.258449 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s 
restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:28:47.263970 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:47.263937 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/5.log" Apr 17 21:28:49.382268 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:49.382243 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:28:49.382802 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:28:49.382427 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:28:50.010960 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:50.010921 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:28:50.010960 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:50.010961 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" Apr 17 21:28:50.011374 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:28:50.011361 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:28:50.011556 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:28:50.011539 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:29:03.379990 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:29:03.379953 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:29:03.382501 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:29:03.380176 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:29:04.379565 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:29:04.379529 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:29:04.379770 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:29:04.379742 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" 
podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:29:15.380128 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:29:15.380098 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:29:15.380512 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:29:15.380292 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:29:17.379598 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:29:17.379568 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:29:17.380004 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:29:17.379764 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:29:27.379770 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:29:27.379737 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:29:27.380248 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:29:27.379911 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:29:30.379562 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:29:30.379487 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:29:30.379943 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:29:30.379692 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:29:39.385051 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:29:39.385021 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:29:39.385447 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:29:39.385197 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:29:42.380070 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:29:42.380037 2576 scope.go:117] "RemoveContainer" 
containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:29:42.380468 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:29:42.380223 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:29:54.380303 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:29:54.380266 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:29:54.380813 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:29:54.380473 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:29:57.384549 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:29:57.384521 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:29:57.384938 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:29:57.384731 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:30:06.379803 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:06.379771 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:30:06.380266 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:30:06.379978 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:30:12.380733 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:12.380692 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:30:12.381119 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:30:12.380897 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:30:15.160598 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:15.160555 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-d7dzj_a8c31f98-fb3d-4210-a0d8-cad1450fb582/manager/0.log" Apr 17 21:30:15.283511 ip-10-0-138-36 kubenswrapper[2576]: I0417 
21:30:15.283460 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7fcbcc7f7-smt76_a6a18dac-c45f-4f71-81b2-a7eaf50652b7/maas-api/0.log" Apr 17 21:30:15.651405 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:15.651366 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-qjchn_c12a37a5-5ddd-4c3f-b3db-3f1d14abc307/manager/0.log" Apr 17 21:30:15.898100 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:15.898061 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-694fdf7c65-tzlqn_335e1c2e-d681-45f4-a58f-ead32b0515c7/manager/0.log" Apr 17 21:30:17.384169 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:17.384145 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:30:17.384554 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:30:17.384321 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:30:17.541395 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:17.541340 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-l9467_1cbc34c7-b672-4646-9831-6f4eda242e7b/manager/0.log" Apr 17 21:30:17.654138 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:17.654035 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-hgjcs_d240f8bc-a3db-485f-abdb-9a9305b4aa8f/manager/0.log" Apr 17 21:30:18.246760 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:18.246726 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-92bv7_8312822f-a2ce-4aec-a7c8-96ec612fb886/manager/0.log" Apr 17 21:30:18.709176 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:18.709141 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-95drp_b68c0e20-2a48-4332-b51b-3c2102fe7149/discovery/0.log" Apr 17 21:30:18.932048 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:18.932018 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56dddbd4f7-pfk7h_1fb263c3-60ed-4d7a-83a0-c7828118a401/kube-auth-proxy/0.log" Apr 17 21:30:19.554699 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:19.554643 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/storage-initializer/0.log" Apr 17 21:30:19.561358 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:19.561332 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_7bffdbbd-62e6-4462-977a-ddae82e5abc8/main/5.log" Apr 17 21:30:20.193645 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:20.193613 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/storage-initializer/0.log" Apr 17 21:30:20.200195 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:20.200169 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_817d58b6-39de-4984-8798-1e60af852ab7/main/5.log" Apr 17 21:30:25.380455 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:25.380423 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:30:25.380863 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:30:25.380624 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:30:26.666719 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:26.666686 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-txkcf_2f888011-8182-4d16-8a26-05b9e1670eb0/global-pull-secret-syncer/0.log" Apr 17 21:30:26.813311 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:26.813283 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xsg6r_e155af39-2618-4164-82af-86a051e4a586/konnectivity-agent/0.log" Apr 17 21:30:26.873183 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:26.873151 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-36.ec2.internal_a2b7385231aa0ba2edc60e303dfabae0/haproxy/0.log" Apr 17 21:30:31.281323 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:31.281288 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-l9467_1cbc34c7-b672-4646-9831-6f4eda242e7b/manager/0.log" Apr 17 21:30:31.316368 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:31.316335 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-hgjcs_d240f8bc-a3db-485f-abdb-9a9305b4aa8f/manager/0.log" Apr 17 21:30:31.479459 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:31.479430 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-92bv7_8312822f-a2ce-4aec-a7c8-96ec612fb886/manager/0.log" Apr 17 21:30:32.380128 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:32.380099 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:30:32.380529 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:30:32.380278 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:30:33.397895 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:33.397861 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4gtnq_c494d1b2-41ce-4d6d-9180-64145606ffcb/node-exporter/0.log" Apr 17 21:30:33.423374 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:33.423340 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4gtnq_c494d1b2-41ce-4d6d-9180-64145606ffcb/kube-rbac-proxy/0.log" Apr 17 21:30:33.446986 ip-10-0-138-36 kubenswrapper[2576]: 
I0417 21:30:33.446958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4gtnq_c494d1b2-41ce-4d6d-9180-64145606ffcb/init-textfile/0.log" Apr 17 21:30:35.072635 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.072602 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-zsm2v_f81f53a6-485f-4654-b5dd-6b5d5f5784c8/networking-console-plugin/0.log" Apr 17 21:30:35.214975 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.214940 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c"] Apr 17 21:30:35.218254 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.218228 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.220539 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.220509 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w64ck\"/\"openshift-service-ca.crt\"" Apr 17 21:30:35.220698 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.220540 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w64ck\"/\"kube-root-ca.crt\"" Apr 17 21:30:35.221445 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.221427 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-w64ck\"/\"default-dockercfg-fwnkt\"" Apr 17 21:30:35.226635 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.226609 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c"] Apr 17 21:30:35.278331 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.278288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-sys\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.278331 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.278332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-lib-modules\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.278547 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.278379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-proc\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.278547 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.278394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kxhn\" (UniqueName: \"kubernetes.io/projected/197db0e8-8984-4abd-af2e-f065bf674bf7-kube-api-access-7kxhn\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.278547 
ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.278481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-podres\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.379992 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.379955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-podres\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.380166 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.380021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-sys\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.380166 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.380052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-lib-modules\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.380166 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.380140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-sys\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.380280 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.380168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-proc\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.380280 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.380200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-lib-modules\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.380280 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.380213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-proc\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.380280 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.380140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/197db0e8-8984-4abd-af2e-f065bf674bf7-podres\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.380280 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.380201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kxhn\" (UniqueName: \"kubernetes.io/projected/197db0e8-8984-4abd-af2e-f065bf674bf7-kube-api-access-7kxhn\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.387720 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.387696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kxhn\" (UniqueName: \"kubernetes.io/projected/197db0e8-8984-4abd-af2e-f065bf674bf7-kube-api-access-7kxhn\") pod \"perf-node-gather-daemonset-j7q2c\" (UID: \"197db0e8-8984-4abd-af2e-f065bf674bf7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.529539 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.529493 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:35.650206 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:35.650129 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c"] Apr 17 21:30:35.653303 ip-10-0-138-36 kubenswrapper[2576]: W0417 21:30:35.653268 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod197db0e8_8984_4abd_af2e_f065bf674bf7.slice/crio-57d1f5fa910527b4e88a249e1975ffb6e11032b1fa4e86483b89ba1a53087df9 WatchSource:0}: Error finding container 57d1f5fa910527b4e88a249e1975ffb6e11032b1fa4e86483b89ba1a53087df9: Status 404 returned error can't find the container with id 57d1f5fa910527b4e88a249e1975ffb6e11032b1fa4e86483b89ba1a53087df9 Apr 17 21:30:36.629869 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:36.629838 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" event={"ID":"197db0e8-8984-4abd-af2e-f065bf674bf7","Type":"ContainerStarted","Data":"bd60268ea13820e5fc4f0e4c7e82a094fa7ed0afba60c82ecfbd8a0e70ac5372"} Apr 17 21:30:36.629869 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:36.629873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" event={"ID":"197db0e8-8984-4abd-af2e-f065bf674bf7","Type":"ContainerStarted","Data":"57d1f5fa910527b4e88a249e1975ffb6e11032b1fa4e86483b89ba1a53087df9"} Apr 17 21:30:36.630298 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:36.629976 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:36.644993 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:36.644933 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" podStartSLOduration=1.644912654 podStartE2EDuration="1.644912654s" podCreationTimestamp="2026-04-17 21:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:30:36.644050227 +0000 UTC m=+1047.845543102" 
watchObservedRunningTime="2026-04-17 21:30:36.644912654 +0000 UTC m=+1047.846405531" Apr 17 21:30:37.547289 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:37.547252 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4vxtr_8487d59d-e14f-48c8-bfd1-045074db5610/dns/0.log" Apr 17 21:30:37.569974 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:37.569934 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4vxtr_8487d59d-e14f-48c8-bfd1-045074db5610/kube-rbac-proxy/0.log" Apr 17 21:30:37.712160 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:37.712132 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dj96t_73df3c3b-340e-459f-a30d-51085c37c69b/dns-node-resolver/0.log" Apr 17 21:30:38.265301 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:38.265276 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8bvhr_becd6e32-bdde-40bf-bef7-c2f14ff29b2b/node-ca/0.log" Apr 17 21:30:39.190035 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:39.190001 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-95drp_b68c0e20-2a48-4332-b51b-3c2102fe7149/discovery/0.log" Apr 17 21:30:39.228520 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:39.228492 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56dddbd4f7-pfk7h_1fb263c3-60ed-4d7a-83a0-c7828118a401/kube-auth-proxy/0.log" Apr 17 21:30:39.381581 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:39.381549 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:30:39.381796 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:30:39.381772 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:30:39.821120 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:39.821087 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mnf4b_99e3f7d8-5975-4eb1-af99-b5c1cd2c0333/serve-healthcheck-canary/0.log" Apr 17 21:30:40.489410 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:40.489381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cx5kv_35f8b229-3626-445e-b9f4-d73400ef233a/kube-rbac-proxy/0.log" Apr 17 21:30:40.535044 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:40.535001 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cx5kv_35f8b229-3626-445e-b9f4-d73400ef233a/exporter/0.log" Apr 17 21:30:40.573309 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:40.573270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cx5kv_35f8b229-3626-445e-b9f4-d73400ef233a/extractor/0.log" Apr 17 21:30:42.644067 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:42.644038 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-j7q2c" Apr 17 21:30:42.664133 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:42.664100 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-d7dzj_a8c31f98-fb3d-4210-a0d8-cad1450fb582/manager/0.log" Apr 17 21:30:42.685874 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:42.685841 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7fcbcc7f7-smt76_a6a18dac-c45f-4f71-81b2-a7eaf50652b7/maas-api/0.log" Apr 17 21:30:42.762354 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:42.762323 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-qjchn_c12a37a5-5ddd-4c3f-b3db-3f1d14abc307/manager/0.log" Apr 17 21:30:42.808212 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:42.808185 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-694fdf7c65-tzlqn_335e1c2e-d681-45f4-a58f-ead32b0515c7/manager/0.log" Apr 17 21:30:43.379887 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:43.379858 2576 scope.go:117] "RemoveContainer" containerID="da6b08227988e2636e4f2eb29cf94e5843b9ae805c50599e0c67b5fc34c25b7e" Apr 17 21:30:43.380072 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:30:43.380035 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4_llm(7bffdbbd-62e6-4462-977a-ddae82e5abc8)\"" pod="llm/e2e-distinct-2-simulated-kserve-769bbfb9db-5f4m4" podUID="7bffdbbd-62e6-4462-977a-ddae82e5abc8" Apr 17 21:30:44.241862 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:44.241828 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-k8bwb_6b6afc13-c81b-4dd2-90c2-699a2fad97dd/openshift-lws-operator/0.log" Apr 17 21:30:49.870429 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:49.870399 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fnlrt_c8698539-048e-4326-92f2-2a5997c36c34/kube-multus-additional-cni-plugins/0.log" Apr 17 21:30:49.891280 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:49.891253 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fnlrt_c8698539-048e-4326-92f2-2a5997c36c34/egress-router-binary-copy/0.log" Apr 17 21:30:49.910165 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:49.910133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fnlrt_c8698539-048e-4326-92f2-2a5997c36c34/cni-plugins/0.log" Apr 17 21:30:49.929774 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:49.929737 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fnlrt_c8698539-048e-4326-92f2-2a5997c36c34/bond-cni-plugin/0.log" Apr 17 21:30:49.948597 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:49.948565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fnlrt_c8698539-048e-4326-92f2-2a5997c36c34/routeoverride-cni/0.log" Apr 17 21:30:49.968513 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:49.968479 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fnlrt_c8698539-048e-4326-92f2-2a5997c36c34/whereabouts-cni-bincopy/0.log" Apr 17 21:30:49.988280 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:49.988256 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fnlrt_c8698539-048e-4326-92f2-2a5997c36c34/whereabouts-cni/0.log" Apr 17 21:30:50.307090 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:50.306996 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mjf2v_f845aa7c-34f0-4456-ae46-79b75dee87d0/kube-multus/0.log" Apr 17 21:30:50.380529 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:50.380504 2576 scope.go:117] "RemoveContainer" containerID="6db04cbbae701d3f7433158eaf089165a2eb87f96172c0460e805402e27b869b" Apr 17 21:30:50.380706 ip-10-0-138-36 kubenswrapper[2576]: E0417 21:30:50.380645 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-kg8rh_llm(817d58b6-39de-4984-8798-1e60af852ab7)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-kg8rh" podUID="817d58b6-39de-4984-8798-1e60af852ab7" Apr 17 21:30:50.472344 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:50.472312 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rcbth_8a39439a-b5a0-4399-975e-838c219449b7/network-metrics-daemon/0.log" Apr 17 21:30:50.491395 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:50.491366 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rcbth_8a39439a-b5a0-4399-975e-838c219449b7/kube-rbac-proxy/0.log" Apr 17 21:30:51.957578 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:51.957542 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wvwmf_10a9acf7-61cb-4537-a133-e83e8426fd8f/ovn-controller/0.log" Apr 17 21:30:51.983190 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:51.983152 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wvwmf_10a9acf7-61cb-4537-a133-e83e8426fd8f/ovn-acl-logging/0.log" Apr 17 21:30:52.004022 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:52.003999 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wvwmf_10a9acf7-61cb-4537-a133-e83e8426fd8f/kube-rbac-proxy-node/0.log" Apr 17 21:30:52.024094 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:52.024062 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wvwmf_10a9acf7-61cb-4537-a133-e83e8426fd8f/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 21:30:52.040024 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:52.039993 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wvwmf_10a9acf7-61cb-4537-a133-e83e8426fd8f/northd/0.log" Apr 17 21:30:52.058933 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:52.058907 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wvwmf_10a9acf7-61cb-4537-a133-e83e8426fd8f/nbdb/0.log" Apr 17 21:30:52.082338 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:52.082298 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wvwmf_10a9acf7-61cb-4537-a133-e83e8426fd8f/sbdb/0.log" Apr 17 21:30:52.227718 ip-10-0-138-36 kubenswrapper[2576]: I0417 21:30:52.227617 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wvwmf_10a9acf7-61cb-4537-a133-e83e8426fd8f/ovnkube-controller/0.log"