Apr 17 20:12:43.830404 ip-10-0-129-50 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 20:12:43.830415 ip-10-0-129-50 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 20:12:43.830422 ip-10-0-129-50 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 20:12:43.830653 ip-10-0-129-50 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 20:12:53.965818 ip-10-0-129-50 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 20:12:53.965833 ip-10-0-129-50 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0c7844b9e9f44a11b46e4c504e5d572f --
Apr 17 20:15:06.716274 ip-10-0-129-50 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:15:07.122003 ip-10-0-129-50 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:15:07.122003 ip-10-0-129-50 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:15:07.122003 ip-10-0-129-50 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:15:07.122003 ip-10-0-129-50 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:15:07.122003 ip-10-0-129-50 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
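Boot 0 above fails before the kubelet binary ever runs: systemd cannot load the unit's environment files, cannot run its 'start-pre' task, marks kubelet.service failed with result 'resources', and then cannot even schedule the restart because crio.service does not exist yet. After the boot separator both are in place and the unit starts normally. A minimal sketch for re-collecting that previous-boot context on the node, assuming systemctl and journalctl are on PATH (the script itself is illustrative):

```python
import subprocess

# Illustrative helper: gather the failure context shown above.
cmds = [
    # A Result of 'resources' means a unit resource (for example an
    # EnvironmentFile= or ExecStartPre= path) could not be loaded.
    ["systemctl", "show", "kubelet.service", "-p", "Result"],
    # Full unit text, including the EnvironmentFile= lines and the
    # crio.service reference that were missing on first boot.
    ["systemctl", "cat", "kubelet.service"],
    # Kubelet messages from the previous boot, as in the excerpt.
    ["journalctl", "-b", "-1", "-u", "kubelet.service", "--no-pager"],
]
for cmd in cmds:
    print("$", " ".join(cmd))
    print(subprocess.run(cmd, capture_output=True, text=True).stdout)
```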
Apr 17 20:15:07.123591 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.123493 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:15:07.128233 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128209 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:15:07.128233 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128230 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:15:07.128233 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128234 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:15:07.128233 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128238 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:15:07.128233 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128242 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128245 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128249 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128251 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128254 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128257 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128260 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128263 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128266 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128269 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128272 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128274 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128277 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128280 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128282 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128285 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128295 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128298 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128300 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128303 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:15:07.128430 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128305 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128308 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128310 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128313 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128316 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128318 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128321 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128324 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128326 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128329 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128332 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128334 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128337 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128339 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128343 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128347 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128349 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128352 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128355 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128357 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:15:07.129008 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128360 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128369 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128374 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128379 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128382 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128386 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128389 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128392 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128395 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128398 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128400 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128403 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128406 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128408 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128411 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128413 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128415 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128418 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128420 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:15:07.129539 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128423 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128425 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128428 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128430 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128433 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128436 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128438 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128442 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128445 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128448 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128451 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128453 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128456 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128458 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128461 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128471 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128474 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128476 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128480 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:15:07.130028 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128483 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128486 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128488 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128491 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128939 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128945 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128949 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128952 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128954 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128957 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128960 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128962 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128965 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128967 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128970 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128973 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128975 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128978 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128980 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:15:07.130483 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128983 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128986 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128989 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128992 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128995 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.128997 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129000 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129002 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129005 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129009 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129012 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129015 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129018 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129021 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129023 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129026 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129029 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129031 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129034 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129037 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:15:07.130950 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129040 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129042 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129045 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129047 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129050 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129052 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129055 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129057 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129060 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129062 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129065 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129068 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129071 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129074 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129076 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129079 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129082 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129084 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129087 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:15:07.131448 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129089 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129092 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129095 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129099 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129102 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129104 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129108 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129112 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129115 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129118 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129121 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129123 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129126 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129129 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129131 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129134 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129136 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129139 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129141 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129144 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:15:07.131940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129146 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129149 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129151 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129154 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129157 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129162 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
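The same set of gate names is re-logged on each configuration pass above and below, so the warning volume overstates the problem; these appear to be OpenShift-side feature gates that this kubelet build simply does not register, and it records them as warnings rather than failing. A short script can collapse the repetition into the distinct names and their counts, assuming the excerpt is saved to a local file (kubelet.log is a placeholder):

```python
import re
from collections import Counter

# Illustrative: count the distinct "unrecognized feature gate" warnings
# in a saved copy of this journal excerpt ("kubelet.log" is a placeholder).
gate_re = re.compile(r"unrecognized feature gate: (\S+)")
counts = Counter()
with open("kubelet.log", encoding="utf-8") as f:
    for line in f:
        counts.update(gate_re.findall(line))

for name, n in counts.most_common():
    print(f"{n:3d}x {name}")
```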
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129167 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129170 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129173 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129176 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129180 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.129183 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130292 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130305 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130314 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130319 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130324 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130328 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130333 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130338 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130341 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 20:15:07.132431 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130345 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130348 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130352 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130356 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130359 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130362 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130365 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130368 2571 flags.go:64] FLAG: --cloud-config=""
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130371 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130374 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130379 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130382 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130386 2571 flags.go:64] FLAG: --config-dir=""
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130388 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130392 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130396 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130401 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130404 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130408 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130411 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130414 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130417 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130421 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130424 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130428 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 20:15:07.132964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130431 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130434 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130438 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130441 2571 flags.go:64] FLAG: --enable-server="true"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130444 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130449 2571 flags.go:64] FLAG: --event-burst="100"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130452 2571 flags.go:64] FLAG: --event-qps="50"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130456 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130459 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130462 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130466 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130469 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130473 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130476 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130479 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130482 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130485 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130489 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130492 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130495 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130501 2571 flags.go:64] FLAG: --feature-gates=""
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130504 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130508 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130511 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130514 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130518 2571 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 20:15:07.133595 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130521 2571 flags.go:64] FLAG: --help="false"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130524 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-129-50.ec2.internal"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130527 2571 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130530 2571 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130533 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130537 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130541 2571 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130544 2571 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130546 2571 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130550 2571 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130555 2571 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130558 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130561 2571 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130564 2571 flags.go:64] FLAG: --kube-reserved=""
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130568 2571 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130571 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130574 2571 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130577 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130580 2571 flags.go:64] FLAG: --lock-file=""
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130583 2571 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130586 2571 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130590 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130596 2571 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 20:15:07.134254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130598 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130601 2571 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130604 2571 flags.go:64] FLAG: --logging-format="text"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130609 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130612 2571 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130615 2571 flags.go:64] FLAG: --manifest-url=""
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130618 2571 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130628 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130631 2571 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130635 2571 flags.go:64] FLAG: --max-pods="110"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130638 2571 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130642 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130645 2571 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130649 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130652 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130655 2571 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130658 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130667 2571 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130670 2571 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130673 2571 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130677 2571 flags.go:64] FLAG: --pod-cidr=""
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130680 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130686 2571 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130689 2571 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 20:15:07.134847 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130692 2571 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130695 2571 flags.go:64] FLAG: --port="10250"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130698 2571 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130701 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05a869ee187c83a46"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130704 2571 flags.go:64] FLAG: --qos-reserved=""
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130707 2571 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130710 2571 flags.go:64] FLAG: --register-node="true"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130713 2571 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130716 2571 flags.go:64] FLAG: --register-with-taints=""
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130720 2571 flags.go:64] FLAG: --registry-burst="10"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130723 2571 flags.go:64] FLAG: --registry-qps="5"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130726 2571 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130730 2571 flags.go:64] FLAG: --reserved-memory=""
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130734 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130738 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130741 2571 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130744 2571 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130747 2571 flags.go:64] FLAG: --runonce="false"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130751 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130754 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130757 2571 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130760 2571 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130763 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130767 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130770 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130773 2571 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 20:15:07.135451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130776 2571 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130779 2571 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130782 2571 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130785 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130789 2571 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130792 2571 flags.go:64] FLAG: --system-cgroups=""
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130795 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130801 2571 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130803 2571 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130806 2571 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130812 2571 flags.go:64] FLAG: --tls-min-version=""
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130815 2571 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130818 2571 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130821 2571 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130824 2571 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130827 2571 flags.go:64] FLAG: --v="2"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130831 2571 flags.go:64] FLAG: --version="false"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130836 2571 flags.go:64] FLAG: --vmodule=""
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130842 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.130845 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130965 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130970 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130973 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130975 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:15:07.136111 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130978 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130981 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130983 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130986 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130989 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130992 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130995 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.130997 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131000 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131002 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131005 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131008 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131012 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131014 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131017 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131020 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131023 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131025 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131029 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131033 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:15:07.136686 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131037 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131040 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131043 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131046 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131049 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131051 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131055 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131058 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131060 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131063 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131066 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131070 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
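The flags.go:64 block above records every kubelet flag with its effective value at startup, which is usually the quickest place to confirm what configuration this node actually received (for example --config="/etc/kubernetes/kubelet.conf", --cloud-provider="external", --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"). A sketch in the same spirit as above for pulling those pairs out of a saved excerpt (again, kubelet.log is a placeholder):

```python
import re

# Illustrative: turn the 'FLAG: --name="value"' lines into a dict.
flag_re = re.compile(r'FLAG: (--[\w-]+)="([^"]*)"')
flags = {}
with open("kubelet.log", encoding="utf-8") as f:
    for line in f:
        flags.update(flag_re.findall(line))

for name in ("--config", "--cloud-provider", "--system-reserved"):
    print(f"{name}={flags.get(name)!r}")
```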
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131074 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131077 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131080 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131083 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131086 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131089 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131092 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:15:07.137227 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131094 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131097 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131100 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131103 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131106 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131109 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131132 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131135 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131139 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131142 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131144 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131151 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131161 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131166 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131169 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131171 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131174 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131176 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131179 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131183 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:15:07.137707 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131185 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131188 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131191 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131195 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131197 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131200 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131203 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131205 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131208 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131211 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131213 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131216 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131219 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131222 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131224 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131227 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131229 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131232 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131234 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:15:07.138226 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131238 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:15:07.138705 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131241 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:15:07.138705 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131243 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:15:07.138705 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.131246 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:15:07.138705 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.131994 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:15:07.138705 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.138639 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 20:15:07.138705 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.138658 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 20:15:07.138705 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138706 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138711 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138715 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138718 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138721 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138724 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138727 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138730 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138733 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138737 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138739 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138742 2571 feature_gate.go:328] unrecognized feature gate:
NoRegistryClusterOperations Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138745 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138747 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138750 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138753 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138756 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138759 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138761 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138764 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:15:07.138909 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138767 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138769 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138772 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138775 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138777 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138781 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138783 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138786 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138789 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138791 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138794 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138796 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138799 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138801 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 
20:15:07.138804 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138807 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138810 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138812 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138815 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138817 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:15:07.139427 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138820 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138823 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138825 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138828 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138830 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138833 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138835 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138838 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138841 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138843 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138845 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138848 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138850 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138853 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138855 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138858 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138860 2571 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138863 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138866 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:15:07.139994 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138870 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138874 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138877 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138880 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138897 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138901 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138904 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138908 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138912 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138915 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138918 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138922 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138925 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138928 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138931 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138934 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138937 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138940 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138943 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138946 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:15:07.140455 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138949 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138952 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138954 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138957 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138960 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138962 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.138965 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.138971 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139072 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139076 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139080 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139083 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139086 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139089 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139092 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:15:07.140971 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139095 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139097 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139100 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139103 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139105 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139110 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139114 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139117 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139120 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139123 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139126 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139129 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139131 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139134 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139137 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139140 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139142 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139145 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139148 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:15:07.141355 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139151 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139154 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139157 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139159 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139162 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139164 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139167 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139170 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139172 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139175 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139177 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139180 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139183 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139186 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139188 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139191 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139194 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139196 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139199 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139202 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:15:07.141827 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139205 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139207 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139210 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139213 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139223 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139226 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139229 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139232 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139234 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139237 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139239 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139242 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139245 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139247 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139249 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139252 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139255 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139257 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139259 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139262 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:15:07.142335 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139264 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139267 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139269 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139272 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139276 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139279 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139281 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139284 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139287 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139290 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139293 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139295 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139298 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139301 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139303 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139305 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139308 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139310 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139312 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:15:07.142818 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:07.139315 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:15:07.143340 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.139320 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:15:07.143340 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.139953 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 20:15:07.143340 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.143103 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 20:15:07.143978 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.143964 2571 server.go:1019] "Starting client certificate rotation"
Apr 17 20:15:07.144083 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.144067 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:15:07.144412 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.144399 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:15:07.166299 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.166274 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:15:07.171577 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.171553 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:15:07.185567 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.185541 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 17 20:15:07.190659 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.190640 2571 log.go:25] "Validated CRI v1 image API"
Apr 17 20:15:07.192375 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.192355 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:15:07.192531 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.192518 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 20:15:07.194643 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.194616 2571 fs.go:135] Filesystem UUIDs: map[700c551e-1c57-4fde-9e0e-69dec812b7f5:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 ef3de868-294f-4586-84b2-66f410b81d49:/dev/nvme0n1p3]
Apr 17 20:15:07.194718 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.194641 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 20:15:07.201179 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.201051 2571 manager.go:217] Machine: {Timestamp:2026-04-17 20:15:07.19842657 +0000 UTC m=+0.370126644 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3105037 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2049a62bbeccc116d6df94594a8e41 SystemUUID:ec2049a6-2bbe-ccc1-16d6-df94594a8e41 BootID:0c7844b9-e9f4-4a11-b46e-4c504e5d572f Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:86:86:3a:1b:49 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:86:86:3a:1b:49 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:c3:f4:11:9e:c3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 20:15:07.201179 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.201174 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 20:15:07.201291 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.201268 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 20:15:07.203622 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.203591 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 20:15:07.203774 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.203625 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-50.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 20:15:07.203822 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.203780 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 20:15:07.203822 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.203789 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 20:15:07.203822 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.203802 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:15:07.204775 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.204763 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:15:07.205867 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.205857 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 20:15:07.205995 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.205986 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 20:15:07.208447 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.208436 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 20:15:07.208483 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.208452 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 20:15:07.208483 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.208464 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 20:15:07.208483 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.208475 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 17 20:15:07.208483 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.208483 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 20:15:07.209464 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.209452 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 20:15:07.209515 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.209471 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 20:15:07.212601 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.212586 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 20:15:07.214226 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.214212 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 20:15:07.215432 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215414 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 20:15:07.215481 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215447 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 20:15:07.215481 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215461 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 20:15:07.215481 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215473 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 20:15:07.215575 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215486 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 20:15:07.215575 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215500 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 20:15:07.215575 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215535 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 20:15:07.215575 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215548 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 20:15:07.215575 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215561 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 20:15:07.215704 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215581 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 20:15:07.215704 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215611 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 20:15:07.215704 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.215628 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 20:15:07.216405 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.216388 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 20:15:07.216405 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.216403 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 20:15:07.220259 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.220237 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 20:15:07.220371 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.220301 2571 server.go:1295] "Started kubelet"
Apr 17 20:15:07.221392 ip-10-0-129-50 systemd[1]: Started Kubernetes Kubelet.
Apr 17 20:15:07.222295 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.221542 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 20:15:07.222295 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.221618 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 20:15:07.222295 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.221857 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 20:15:07.223046 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.223028 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 20:15:07.227057 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.227026 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-50.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 20:15:07.227185 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.227168 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 20:15:07.228329 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.228224 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 20:15:07.228442 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.228327 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-50.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 20:15:07.229509 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.229491 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 20:15:07.229611 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.229538 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 20:15:07.230269 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.230251 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 20:15:07.230269 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.230270 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 20:15:07.230395 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.230323 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 20:15:07.230395 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.229365 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-50.ec2.internal.18a73e2521a70c4b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-50.ec2.internal,UID:ip-10-0-129-50.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-50.ec2.internal,},FirstTimestamp:2026-04-17 20:15:07.220257867 +0000 UTC m=+0.391957943,LastTimestamp:2026-04-17 20:15:07.220257867 +0000 UTC m=+0.391957943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-50.ec2.internal,}"
Apr 17 20:15:07.230523 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.230419 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 20:15:07.230523 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.230428 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 20:15:07.230523 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.230473 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found"
Apr 17 20:15:07.232310 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.232292 2571 factory.go:55] Registering systemd factory
Apr 17 20:15:07.232379 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.232352 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 17 20:15:07.232478 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.232446 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 20:15:07.232894 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.232748 2571 factory.go:153] Registering CRI-O factory
Apr 17 20:15:07.232894 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.232766 2571 factory.go:223] Registration of the crio container factory successfully
Apr 17 20:15:07.232894 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.232812 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 17 20:15:07.232894 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.232822 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 20:15:07.232894 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.232844 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-50.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 17 20:15:07.232894 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.232864 2571 factory.go:103] Registering Raw factory
Apr 17 20:15:07.233140 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.232902 2571 manager.go:1196] Started watching for new ooms in manager
Apr 17 20:15:07.233361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.233348 2571 manager.go:319] Starting recovery of all containers
Apr 17 20:15:07.236652 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.236623 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t59sv"
Apr 17 20:15:07.242341 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.242156 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t59sv"
Apr 17 20:15:07.243901 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.243789 2571 manager.go:324] Recovery completed
Apr 17 20:15:07.249409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.249390 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:15:07.251752 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.251733 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:15:07.251813 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.251763 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:15:07.251813 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.251774 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:15:07.252348 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.252333 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 20:15:07.252348 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.252344 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 20:15:07.252445 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.252362 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 20:15:07.253831 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.253759 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-50.ec2.internal.18a73e252387982c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-50.ec2.internal,UID:ip-10-0-129-50.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-50.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-50.ec2.internal,},FirstTimestamp:2026-04-17 20:15:07.251750956 +0000 UTC m=+0.423451028,LastTimestamp:2026-04-17 20:15:07.251750956 +0000 UTC m=+0.423451028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-50.ec2.internal,}"
Apr 17 20:15:07.254525 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.254511 2571 policy_none.go:49] "None policy: Start"
Apr 17 20:15:07.254568 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.254534 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 20:15:07.254568 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.254551 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 20:15:07.296548 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.296523 2571 manager.go:341] "Starting Device Plugin manager"
Apr 17 20:15:07.301671 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.296584 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 20:15:07.301671 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.296599 2571 server.go:85] "Starting device plugin registration server"
Apr 17 20:15:07.301671 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.297011 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 20:15:07.301671 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.297023 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 20:15:07.301671 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.297160 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 20:15:07.301671 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.297261 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 20:15:07.301671 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.297270 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 20:15:07.301671 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.297817 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 20:15:07.301671 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.297854 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-50.ec2.internal\" not found"
Apr 17 20:15:07.332648 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.332612 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 20:15:07.334102 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.334083 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 20:15:07.334216 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.334110 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 20:15:07.334216 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.334129 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 20:15:07.334216 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.334136 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 20:15:07.334216 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.334168 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 20:15:07.340392 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.340375 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:15:07.397212 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.397107 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:15:07.398142 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.398127 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:15:07.398200 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.398158 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:15:07.398200 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.398169 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:15:07.398200 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.398193 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-50.ec2.internal"
Apr 17 20:15:07.406849 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.406834 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-50.ec2.internal"
Apr 17 20:15:07.406924 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.406857 2571 kubelet_node_status.go:597] "Error updating node status,
will retry" err="error getting node \"ip-10-0-129-50.ec2.internal\": node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:07.424249 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.424224 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:07.435039 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.435008 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal"] Apr 17 20:15:07.435142 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.435074 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:07.436004 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.435985 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:07.436114 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.436017 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:07.436114 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.436029 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:07.437320 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.437307 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:07.437444 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.437429 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.437484 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.437458 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:07.438065 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.438045 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:07.438166 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.438080 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:07.438166 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.438093 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:07.438233 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.438053 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:07.438233 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.438225 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:07.438290 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.438236 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:07.439262 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.439246 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.439315 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.439278 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:15:07.439980 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.439964 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:15:07.440049 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.439996 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:15:07.440049 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.440013 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:15:07.457520 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.457494 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-50.ec2.internal\" not found" node="ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.462135 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.462110 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-50.ec2.internal\" not found" node="ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.525044 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.525005 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:07.531996 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.531961 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3558e7db711dfe8c57fb2595118440f7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal\" (UID: \"3558e7db711dfe8c57fb2595118440f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.532137 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.531999 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3558e7db711dfe8c57fb2595118440f7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal\" (UID: \"3558e7db711dfe8c57fb2595118440f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.532137 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.532024 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06b467c1ac790ac275ddafdf170566ce-config\") pod \"kube-apiserver-proxy-ip-10-0-129-50.ec2.internal\" (UID: \"06b467c1ac790ac275ddafdf170566ce\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.625567 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.625537 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:07.632963 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.632937 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3558e7db711dfe8c57fb2595118440f7-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal\" (UID: \"3558e7db711dfe8c57fb2595118440f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.633021 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.633010 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3558e7db711dfe8c57fb2595118440f7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal\" (UID: \"3558e7db711dfe8c57fb2595118440f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.633054 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.633034 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3558e7db711dfe8c57fb2595118440f7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal\" (UID: \"3558e7db711dfe8c57fb2595118440f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.633092 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.633061 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3558e7db711dfe8c57fb2595118440f7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal\" (UID: \"3558e7db711dfe8c57fb2595118440f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.633092 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.633066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06b467c1ac790ac275ddafdf170566ce-config\") pod \"kube-apiserver-proxy-ip-10-0-129-50.ec2.internal\" (UID: \"06b467c1ac790ac275ddafdf170566ce\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.633150 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.633108 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06b467c1ac790ac275ddafdf170566ce-config\") pod \"kube-apiserver-proxy-ip-10-0-129-50.ec2.internal\" (UID: \"06b467c1ac790ac275ddafdf170566ce\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.726438 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.726372 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:07.758845 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.758808 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.764280 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:07.764260 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal" Apr 17 20:15:07.826905 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.826839 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:07.927471 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:07.927431 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:08.027971 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:08.027940 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:08.128680 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:08.128641 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:08.145100 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.145076 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 20:15:08.145241 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.145224 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 20:15:08.229345 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:08.229316 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:08.230448 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.230432 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 20:15:08.244014 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.243995 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:15:08.245014 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.244980 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 20:10:07 +0000 UTC" deadline="2027-09-21 08:49:20.581269595 +0000 UTC" Apr 17 20:15:08.245069 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.245020 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12516h34m12.336257397s" Apr 17 20:15:08.262281 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.262238 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zlbpb" Apr 17 20:15:08.269422 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.269386 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zlbpb" Apr 17 20:15:08.329789 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:08.329699 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:08.363351 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:08.363317 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3558e7db711dfe8c57fb2595118440f7.slice/crio-832a133b8d0d52a6fd174ea88d63858187ee7cf5ccada92f57ce88754795de1c WatchSource:0}: Error finding container 832a133b8d0d52a6fd174ea88d63858187ee7cf5ccada92f57ce88754795de1c: Status 404 returned error can't find the container with id 832a133b8d0d52a6fd174ea88d63858187ee7cf5ccada92f57ce88754795de1c Apr 17 20:15:08.363914 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:08.363860 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06b467c1ac790ac275ddafdf170566ce.slice/crio-5d0dcb907c752b646f384bbf2aadb24192bab979a5cf0b76f84694d48f996737 WatchSource:0}: Error finding container 5d0dcb907c752b646f384bbf2aadb24192bab979a5cf0b76f84694d48f996737: Status 404 returned error can't find the container with id 5d0dcb907c752b646f384bbf2aadb24192bab979a5cf0b76f84694d48f996737 Apr 17 20:15:08.367784 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.367768 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:15:08.430583 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:08.430544 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-50.ec2.internal\" not found" Apr 17 20:15:08.480229 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.480201 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:08.529825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.529783 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" Apr 17 20:15:08.542899 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.542854 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 20:15:08.544475 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.544459 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal" Apr 17 20:15:08.551935 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.551911 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 20:15:08.646691 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.646602 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:08.744234 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:08.742963 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:09.210041 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.210009 2571 apiserver.go:52] "Watching apiserver" Apr 17 20:15:09.219476 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.219445 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 20:15:09.220292 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.219895 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-9qqd7","openshift-multus/network-metrics-daemon-pwx7f","openshift-network-diagnostics/network-check-target-z2fcw","openshift-cluster-node-tuning-operator/tuned-b78sb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal","openshift-multus/multus-additional-cni-plugins-ks7qb","openshift-network-operator/iptables-alerter-k2w9b","openshift-ovn-kubernetes/ovnkube-node-tmwcg","kube-system/konnectivity-agent-vhbpf","kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw","openshift-dns/node-resolver-kjf7q","openshift-image-registry/node-ca-n66bj"] Apr 17 20:15:09.222839 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.222484 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k2w9b" Apr 17 20:15:09.225214 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.224898 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 20:15:09.225214 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.225011 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 20:15:09.225214 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.225020 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:15:09.225214 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.225100 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-h88qx\"" Apr 17 20:15:09.226670 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.226650 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:09.226778 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.226716 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:09.226778 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.226740 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:09.226904 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.226790 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:09.228942 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.228918 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.230771 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.230726 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 20:15:09.231202 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.231182 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:15:09.231295 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.231206 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gp4jr\"" Apr 17 20:15:09.233494 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.233467 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.238203 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.238178 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 20:15:09.238489 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.238464 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 20:15:09.238652 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.238628 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 20:15:09.238852 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.238836 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 20:15:09.239044 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.239023 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gqhsv\"" Apr 17 20:15:09.240833 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.240810 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.241668 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241635 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-sys\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.241764 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241672 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4c85788b-86c1-4378-923b-48b43a9c6100-etc-tuned\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.241764 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241696 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-system-cni-dir\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.241764 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241722 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-os-release\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.241764 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241746 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.242035 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241768 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.242035 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da978ea1-70f9-45a3-9f4e-e26cc21544b2-iptables-alerter-script\") pod \"iptables-alerter-k2w9b\" (UID: \"da978ea1-70f9-45a3-9f4e-e26cc21544b2\") " pod="openshift-network-operator/iptables-alerter-k2w9b" Apr 17 20:15:09.242035 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241845 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-run\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242035 
ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241869 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlr4g\" (UniqueName: \"kubernetes.io/projected/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-kube-api-access-zlr4g\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.242035 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241915 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da978ea1-70f9-45a3-9f4e-e26cc21544b2-host-slash\") pod \"iptables-alerter-k2w9b\" (UID: \"da978ea1-70f9-45a3-9f4e-e26cc21544b2\") " pod="openshift-network-operator/iptables-alerter-k2w9b" Apr 17 20:15:09.242035 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241941 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-sysctl-d\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242035 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241965 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fptk\" (UniqueName: \"kubernetes.io/projected/4c85788b-86c1-4378-923b-48b43a9c6100-kube-api-access-7fptk\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242035 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.241994 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-cnibin\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242018 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-kubernetes\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242063 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-var-lib-kubelet\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242104 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242130 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-lib-modules\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxwwj\" (UniqueName: \"kubernetes.io/projected/da978ea1-70f9-45a3-9f4e-e26cc21544b2-kube-api-access-kxwwj\") pod \"iptables-alerter-k2w9b\" (UID: \"da978ea1-70f9-45a3-9f4e-e26cc21544b2\") " pod="openshift-network-operator/iptables-alerter-k2w9b" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp2w5\" (UniqueName: \"kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5\") pod \"network-check-target-z2fcw\" (UID: \"261e1de3-1829-4520-b7ef-6bb874d9f16e\") " pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242242 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-systemd\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242267 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-host\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242289 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c85788b-86c1-4378-923b-48b43a9c6100-tmp\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242330 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:09.242409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242372 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6zqs\" (UniqueName: 
\"kubernetes.io/projected/85885202-a740-4319-827b-236ded2de085-kube-api-access-x6zqs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:09.242983 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242416 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-modprobe-d\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242983 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-sysconfig\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242983 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242468 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-sysctl-conf\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.242983 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.242808 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 20:15:09.243201 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.243040 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 20:15:09.243201 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.243111 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 20:15:09.243324 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.243312 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 20:15:09.247138 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.247104 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 20:15:09.247838 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.247813 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 20:15:09.248006 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.247991 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.248192 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.248172 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l6bsz\"" Apr 17 20:15:09.250216 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.250191 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 20:15:09.250959 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.250940 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-w2cwx\"" Apr 17 20:15:09.251094 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.250972 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 20:15:09.251172 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.251163 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:09.251330 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.251306 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.254489 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.254468 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kjf7q" Apr 17 20:15:09.255062 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.255041 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 20:15:09.255145 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.255086 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8dfth\"" Apr 17 20:15:09.255145 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.255087 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 20:15:09.255249 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.255235 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6njhp\"" Apr 17 20:15:09.255334 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.255294 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 20:15:09.255385 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.255373 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 20:15:09.255434 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.255394 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 20:15:09.256470 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.256445 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cv7rk\"" Apr 17 20:15:09.256632 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.256612 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 
17 20:15:09.257050 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.257030 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n66bj" Apr 17 20:15:09.257610 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.257595 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 20:15:09.260063 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.260015 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 20:15:09.260063 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.260041 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 20:15:09.260216 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.260076 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ct92n\"" Apr 17 20:15:09.260216 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.260021 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 20:15:09.270374 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.270344 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:10:08 +0000 UTC" deadline="2027-09-21 22:49:24.343419799 +0000 UTC" Apr 17 20:15:09.270479 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.270373 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12530h34m15.073049285s" Apr 17 20:15:09.272974 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.272951 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:15:09.331634 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.331553 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 20:15:09.340096 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.340010 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" event={"ID":"3558e7db711dfe8c57fb2595118440f7","Type":"ContainerStarted","Data":"832a133b8d0d52a6fd174ea88d63858187ee7cf5ccada92f57ce88754795de1c"} Apr 17 20:15:09.341563 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.341533 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal" event={"ID":"06b467c1ac790ac275ddafdf170566ce","Type":"ContainerStarted","Data":"5d0dcb907c752b646f384bbf2aadb24192bab979a5cf0b76f84694d48f996737"} Apr 17 20:15:09.342773 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.342750 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-sys\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.342873 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.342788 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-os-release\") pod 
\"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.342873 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.342815 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.342873 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.342846 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-systemd-units\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.342873 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.342871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-socket-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.343182 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.342916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da978ea1-70f9-45a3-9f4e-e26cc21544b2-host-slash\") pod \"iptables-alerter-k2w9b\" (UID: \"da978ea1-70f9-45a3-9f4e-e26cc21544b2\") " pod="openshift-network-operator/iptables-alerter-k2w9b" Apr 17 20:15:09.343263 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2f38744-a4e3-4cc0-94fe-0190e5c2c772-serviceca\") pod \"node-ca-n66bj\" (UID: \"e2f38744-a4e3-4cc0-94fe-0190e5c2c772\") " pod="openshift-image-registry/node-ca-n66bj" Apr 17 20:15:09.343311 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343273 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-system-cni-dir\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.343364 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343321 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-etc-openvswitch\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.343405 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343360 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 
20:15:09.343405 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-cnibin\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.343500 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-hostroot\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.343500 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-ovnkube-script-lib\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.343500 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343487 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.343647 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-kubernetes\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.343647 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.343647 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343587 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/785f90d6-d02c-465a-8c2b-761d1fb1e10b-agent-certs\") pod \"konnectivity-agent-vhbpf\" (UID: \"785f90d6-d02c-465a-8c2b-761d1fb1e10b\") " pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:09.343647 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343618 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eeee41e0-c56f-4312-97f4-258ec5fc4d4d-tmp-dir\") pod \"node-resolver-kjf7q\" (UID: \"eeee41e0-c56f-4312-97f4-258ec5fc4d4d\") " pod="openshift-dns/node-resolver-kjf7q" Apr 17 20:15:09.343761 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343647 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-run-ovn\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.343761 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343676 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-env-overrides\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.343761 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343723 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxwwj\" (UniqueName: \"kubernetes.io/projected/da978ea1-70f9-45a3-9f4e-e26cc21544b2-kube-api-access-kxwwj\") pod \"iptables-alerter-k2w9b\" (UID: \"da978ea1-70f9-45a3-9f4e-e26cc21544b2\") " pod="openshift-network-operator/iptables-alerter-k2w9b" Apr 17 20:15:09.343858 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.342871 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-sys\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.343858 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-os-release\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.343971 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.343855 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da978ea1-70f9-45a3-9f4e-e26cc21544b2-host-slash\") pod \"iptables-alerter-k2w9b\" (UID: \"da978ea1-70f9-45a3-9f4e-e26cc21544b2\") " pod="openshift-network-operator/iptables-alerter-k2w9b" Apr 17 20:15:09.344733 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.344415 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-cnibin\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.344733 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.344431 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.344733 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.344535 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-kubernetes\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.344733 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.344690 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.344987 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.344788 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b2daba4-d463-4f69-96fb-a701b6be9b79-cni-binary-copy\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.344987 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.344830 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-run-k8s-cni-cncf-io\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.344987 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.344855 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-var-lib-kubelet\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.344987 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.344927 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-cni-netd\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.345141 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.344986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-device-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.345141 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345019 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-systemd\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.345141 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-host\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.345141 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345109 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c85788b-86c1-4378-923b-48b43a9c6100-tmp\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " 
pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.345370 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345144 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-run-multus-certs\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.345370 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-etc-kubernetes\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.345370 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345198 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.345370 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345220 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:09.345370 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345251 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eeee41e0-c56f-4312-97f4-258ec5fc4d4d-hosts-file\") pod \"node-resolver-kjf7q\" (UID: \"eeee41e0-c56f-4312-97f4-258ec5fc4d4d\") " pod="openshift-dns/node-resolver-kjf7q" Apr 17 20:15:09.345370 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345346 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-host\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.345635 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345411 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-run-netns\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.345635 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-var-lib-cni-multus\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.345635 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345608 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 20:15:09.345765 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-run-openvswitch\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.345765 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345715 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-cni-dir\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.345765 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345749 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-cnibin\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.345860 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.345784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4c85788b-86c1-4378-923b-48b43a9c6100-etc-tuned\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.346365 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.346242 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:09.346365 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.346360 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs podName:85885202-a740-4319-827b-236ded2de085 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:09.846319343 +0000 UTC m=+3.018019418 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs") pod "network-metrics-daemon-pwx7f" (UID: "85885202-a740-4319-827b-236ded2de085") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:09.346549 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.346528 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-systemd\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.346607 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.346584 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-system-cni-dir\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.346760 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.346741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.348862 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.347715 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da978ea1-70f9-45a3-9f4e-e26cc21544b2-iptables-alerter-script\") pod \"iptables-alerter-k2w9b\" (UID: \"da978ea1-70f9-45a3-9f4e-e26cc21544b2\") " pod="openshift-network-operator/iptables-alerter-k2w9b" Apr 17 20:15:09.348862 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.347779 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-conf-dir\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.348862 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.348465 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.348862 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.348551 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-system-cni-dir\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.349550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.349157 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-run\") pod \"tuned-b78sb\" (UID: 
\"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.349550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.349207 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlr4g\" (UniqueName: \"kubernetes.io/projected/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-kube-api-access-zlr4g\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.349550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.349241 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-run\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.349550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.349290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/785f90d6-d02c-465a-8c2b-761d1fb1e10b-konnectivity-ca\") pod \"konnectivity-agent-vhbpf\" (UID: \"785f90d6-d02c-465a-8c2b-761d1fb1e10b\") " pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:09.349550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.349337 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da978ea1-70f9-45a3-9f4e-e26cc21544b2-iptables-alerter-script\") pod \"iptables-alerter-k2w9b\" (UID: \"da978ea1-70f9-45a3-9f4e-e26cc21544b2\") " pod="openshift-network-operator/iptables-alerter-k2w9b" Apr 17 20:15:09.349550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.349407 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdkqr\" (UniqueName: \"kubernetes.io/projected/eeee41e0-c56f-4312-97f4-258ec5fc4d4d-kube-api-access-vdkqr\") pod \"node-resolver-kjf7q\" (UID: \"eeee41e0-c56f-4312-97f4-258ec5fc4d4d\") " pod="openshift-dns/node-resolver-kjf7q" Apr 17 20:15:09.349550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.349437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-node-log\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.350272 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4c85788b-86c1-4378-923b-48b43a9c6100-etc-tuned\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.350378 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350252 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkj9c\" (UniqueName: \"kubernetes.io/projected/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-kube-api-access-xkj9c\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.350378 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350336 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-sysctl-d\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.350486 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350382 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fptk\" (UniqueName: \"kubernetes.io/projected/4c85788b-86c1-4378-923b-48b43a9c6100-kube-api-access-7fptk\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.350486 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350435 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-daemon-config\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.350621 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350495 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-sysctl-d\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.350621 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350545 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-cni-bin\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.350621 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350566 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c85788b-86c1-4378-923b-48b43a9c6100-tmp\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.350621 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-registration-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.350621 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350608 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-var-lib-kubelet\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.350868 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350706 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-var-lib-kubelet\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 
20:15:09.350868 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.350868 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350810 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-ovn-node-metrics-cert\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.350868 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350845 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-lib-modules\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.351085 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350906 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-run-systemd\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.351085 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350946 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-var-lib-openvswitch\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.351085 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.350982 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2w5\" (UniqueName: \"kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5\") pod \"network-check-target-z2fcw\" (UID: \"261e1de3-1829-4520-b7ef-6bb874d9f16e\") " pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:09.351085 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351012 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-lib-modules\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.351085 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351014 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-kubelet\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.351356 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351149 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-run-netns\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.351710 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351680 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-log-socket\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.351814 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-sys-fs\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.351814 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351768 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6zqs\" (UniqueName: \"kubernetes.io/projected/85885202-a740-4319-827b-236ded2de085-kube-api-access-x6zqs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:09.351814 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2f38744-a4e3-4cc0-94fe-0190e5c2c772-host\") pod \"node-ca-n66bj\" (UID: \"e2f38744-a4e3-4cc0-94fe-0190e5c2c772\") " pod="openshift-image-registry/node-ca-n66bj" Apr 17 20:15:09.351989 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljmfp\" (UniqueName: \"kubernetes.io/projected/e2f38744-a4e3-4cc0-94fe-0190e5c2c772-kube-api-access-ljmfp\") pod \"node-ca-n66bj\" (UID: \"e2f38744-a4e3-4cc0-94fe-0190e5c2c772\") " pod="openshift-image-registry/node-ca-n66bj" Apr 17 20:15:09.351989 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351859 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-os-release\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.351989 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351907 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn68n\" (UniqueName: \"kubernetes.io/projected/4b2daba4-d463-4f69-96fb-a701b6be9b79-kube-api-access-nn68n\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.351989 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.351948 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-ovnkube-config\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.352202 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352009 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.352202 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352031 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-modprobe-d\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.352202 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352070 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-sysconfig\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.352202 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-sysctl-conf\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.352202 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352126 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-sysconfig\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.352202 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352178 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-slash\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.352498 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352214 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-modprobe-d\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.352498 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352211 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sj5x\" (UniqueName: \"kubernetes.io/projected/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-kube-api-access-6sj5x\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.352498 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352255 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-socket-dir-parent\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.352498 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-var-lib-cni-bin\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.352498 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.352285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4c85788b-86c1-4378-923b-48b43a9c6100-etc-sysctl-conf\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.353223 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.353201 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.355741 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.355717 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:09.355741 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.355744 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:09.355914 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.355757 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vp2w5 for pod openshift-network-diagnostics/network-check-target-z2fcw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:09.355914 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.355795 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxwwj\" (UniqueName: \"kubernetes.io/projected/da978ea1-70f9-45a3-9f4e-e26cc21544b2-kube-api-access-kxwwj\") pod \"iptables-alerter-k2w9b\" (UID: \"da978ea1-70f9-45a3-9f4e-e26cc21544b2\") " pod="openshift-network-operator/iptables-alerter-k2w9b" Apr 17 20:15:09.355914 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.355865 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5 podName:261e1de3-1829-4520-b7ef-6bb874d9f16e nodeName:}" failed. No retries permitted until 2026-04-17 20:15:09.855844658 +0000 UTC m=+3.027544722 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vp2w5" (UniqueName: "kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5") pod "network-check-target-z2fcw" (UID: "261e1de3-1829-4520-b7ef-6bb874d9f16e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:09.357714 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.357694 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlr4g\" (UniqueName: \"kubernetes.io/projected/5b1dd3fa-14f5-434c-919f-b8ceabccf2b6-kube-api-access-zlr4g\") pod \"multus-additional-cni-plugins-ks7qb\" (UID: \"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6\") " pod="openshift-multus/multus-additional-cni-plugins-ks7qb" Apr 17 20:15:09.366376 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.366351 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6zqs\" (UniqueName: \"kubernetes.io/projected/85885202-a740-4319-827b-236ded2de085-kube-api-access-x6zqs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:09.372110 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.372084 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fptk\" (UniqueName: \"kubernetes.io/projected/4c85788b-86c1-4378-923b-48b43a9c6100-kube-api-access-7fptk\") pod \"tuned-b78sb\" (UID: \"4c85788b-86c1-4378-923b-48b43a9c6100\") " pod="openshift-cluster-node-tuning-operator/tuned-b78sb" Apr 17 20:15:09.452899 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.452849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-hostroot\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453076 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.452975 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-hostroot\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453076 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-ovnkube-script-lib\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.453076 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.453076 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/785f90d6-d02c-465a-8c2b-761d1fb1e10b-agent-certs\") pod \"konnectivity-agent-vhbpf\" (UID: 
\"785f90d6-d02c-465a-8c2b-761d1fb1e10b\") " pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:09.453240 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453103 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eeee41e0-c56f-4312-97f4-258ec5fc4d4d-tmp-dir\") pod \"node-resolver-kjf7q\" (UID: \"eeee41e0-c56f-4312-97f4-258ec5fc4d4d\") " pod="openshift-dns/node-resolver-kjf7q" Apr 17 20:15:09.453240 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453129 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-run-ovn\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.453240 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453139 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.453240 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-env-overrides\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.453240 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453178 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b2daba4-d463-4f69-96fb-a701b6be9b79-cni-binary-copy\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453240 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453202 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-run-k8s-cni-cncf-io\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453240 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-var-lib-kubelet\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453276 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-cni-netd\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453319 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-device-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" 
(UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-run-multus-certs\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453394 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-etc-kubernetes\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eeee41e0-c56f-4312-97f4-258ec5fc4d4d-tmp-dir\") pod \"node-resolver-kjf7q\" (UID: \"eeee41e0-c56f-4312-97f4-258ec5fc4d4d\") " pod="openshift-dns/node-resolver-kjf7q" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453419 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453460 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eeee41e0-c56f-4312-97f4-258ec5fc4d4d-hosts-file\") pod \"node-resolver-kjf7q\" (UID: \"eeee41e0-c56f-4312-97f4-258ec5fc4d4d\") " pod="openshift-dns/node-resolver-kjf7q" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453474 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-cni-netd\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453478 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-run-netns\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-var-lib-cni-multus\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-run-netns\") pod \"multus-9qqd7\" (UID: 
\"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453544 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-run-openvswitch\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453565 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-cni-dir\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453581 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-device-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-cnibin\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453628 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eeee41e0-c56f-4312-97f4-258ec5fc4d4d-hosts-file\") pod \"node-resolver-kjf7q\" (UID: \"eeee41e0-c56f-4312-97f4-258ec5fc4d4d\") " pod="openshift-dns/node-resolver-kjf7q" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-conf-dir\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453664 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/785f90d6-d02c-465a-8c2b-761d1fb1e10b-konnectivity-ca\") pod \"konnectivity-agent-vhbpf\" (UID: \"785f90d6-d02c-465a-8c2b-761d1fb1e10b\") " pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453692 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdkqr\" (UniqueName: \"kubernetes.io/projected/eeee41e0-c56f-4312-97f4-258ec5fc4d4d-kube-api-access-vdkqr\") pod \"node-resolver-kjf7q\" (UID: 
\"eeee41e0-c56f-4312-97f4-258ec5fc4d4d\") " pod="openshift-dns/node-resolver-kjf7q" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453705 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-run-k8s-cni-cncf-io\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453715 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-node-log\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453742 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkj9c\" (UniqueName: \"kubernetes.io/projected/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-kube-api-access-xkj9c\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453754 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-cnibin\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-daemon-config\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453792 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-run-multus-certs\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-cni-bin\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-etc-kubernetes\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453834 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-registration-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: 
\"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.453863 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453868 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-var-lib-cni-multus\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-ovn-node-metrics-cert\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453933 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-run-openvswitch\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-run-systemd\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453966 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-var-lib-openvswitch\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-cni-dir\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-kubelet\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454020 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-env-overrides\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-node-log\") pod 
\"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454033 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-run-netns\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453665 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-ovnkube-script-lib\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-conf-dir\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454060 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-log-socket\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454127 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-sys-fs\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454154 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2f38744-a4e3-4cc0-94fe-0190e5c2c772-host\") pod \"node-ca-n66bj\" (UID: \"e2f38744-a4e3-4cc0-94fe-0190e5c2c772\") " pod="openshift-image-registry/node-ca-n66bj" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljmfp\" (UniqueName: \"kubernetes.io/projected/e2f38744-a4e3-4cc0-94fe-0190e5c2c772-kube-api-access-ljmfp\") pod \"node-ca-n66bj\" (UID: \"e2f38744-a4e3-4cc0-94fe-0190e5c2c772\") " pod="openshift-image-registry/node-ca-n66bj" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454236 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b2daba4-d463-4f69-96fb-a701b6be9b79-cni-binary-copy\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-os-release\") pod \"multus-9qqd7\" 
(UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.454361 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn68n\" (UniqueName: \"kubernetes.io/projected/4b2daba4-d463-4f69-96fb-a701b6be9b79-kube-api-access-nn68n\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454299 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-ovnkube-config\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454299 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-kubelet\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454335 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453669 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-run-ovn\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-run-netns\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454365 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-slash\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454397 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-slash\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454420 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sj5x\" (UniqueName: \"kubernetes.io/projected/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-kube-api-access-6sj5x\") pod 
\"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454454 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-socket-dir-parent\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-var-lib-cni-bin\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/785f90d6-d02c-465a-8c2b-761d1fb1e10b-konnectivity-ca\") pod \"konnectivity-agent-vhbpf\" (UID: \"785f90d6-d02c-465a-8c2b-761d1fb1e10b\") " pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454527 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-systemd-units\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454424 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-log-socket\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.453281 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-var-lib-kubelet\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-socket-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454572 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-sys-fs\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454583 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/e2f38744-a4e3-4cc0-94fe-0190e5c2c772-serviceca\") pod \"node-ca-n66bj\" (UID: \"e2f38744-a4e3-4cc0-94fe-0190e5c2c772\") " pod="openshift-image-registry/node-ca-n66bj" Apr 17 20:15:09.454825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-system-cni-dir\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2f38744-a4e3-4cc0-94fe-0190e5c2c772-host\") pod \"node-ca-n66bj\" (UID: \"e2f38744-a4e3-4cc0-94fe-0190e5c2c772\") " pod="openshift-image-registry/node-ca-n66bj" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-etc-openvswitch\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454673 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-etc-openvswitch\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454721 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-socket-dir-parent\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454754 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-host-var-lib-cni-bin\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454790 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-systemd-units\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454833 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-run-systemd\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454872 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-cni-bin\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454918 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-os-release\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.454945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-registration-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.455163 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b2daba4-d463-4f69-96fb-a701b6be9b79-multus-daemon-config\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.455188 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.455246 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-var-lib-openvswitch\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.455308 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.455342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.455532 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.455359 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-socket-dir\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.455532 ip-10-0-129-50 
kubenswrapper[2571]: I0417 20:15:09.455482 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b2daba4-d463-4f69-96fb-a701b6be9b79-system-cni-dir\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.456152 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.455646 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2f38744-a4e3-4cc0-94fe-0190e5c2c772-serviceca\") pod \"node-ca-n66bj\" (UID: \"e2f38744-a4e3-4cc0-94fe-0190e5c2c772\") " pod="openshift-image-registry/node-ca-n66bj" Apr 17 20:15:09.456152 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.455777 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-ovnkube-config\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.456152 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.455823 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/785f90d6-d02c-465a-8c2b-761d1fb1e10b-agent-certs\") pod \"konnectivity-agent-vhbpf\" (UID: \"785f90d6-d02c-465a-8c2b-761d1fb1e10b\") " pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:09.457584 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.457560 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-ovn-node-metrics-cert\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.463727 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.463612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdkqr\" (UniqueName: \"kubernetes.io/projected/eeee41e0-c56f-4312-97f4-258ec5fc4d4d-kube-api-access-vdkqr\") pod \"node-resolver-kjf7q\" (UID: \"eeee41e0-c56f-4312-97f4-258ec5fc4d4d\") " pod="openshift-dns/node-resolver-kjf7q" Apr 17 20:15:09.463727 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.463638 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkj9c\" (UniqueName: \"kubernetes.io/projected/968f9d19-7fd2-4b4b-9b77-d24caa01a1ea-kube-api-access-xkj9c\") pod \"aws-ebs-csi-driver-node-sprdw\" (UID: \"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" Apr 17 20:15:09.463875 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.463749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sj5x\" (UniqueName: \"kubernetes.io/projected/fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26-kube-api-access-6sj5x\") pod \"ovnkube-node-tmwcg\" (UID: \"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:09.465290 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.465270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn68n\" (UniqueName: \"kubernetes.io/projected/4b2daba4-d463-4f69-96fb-a701b6be9b79-kube-api-access-nn68n\") pod \"multus-9qqd7\" (UID: \"4b2daba4-d463-4f69-96fb-a701b6be9b79\") " pod="openshift-multus/multus-9qqd7" Apr 17 20:15:09.466116 
ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.466099 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljmfp\" (UniqueName: \"kubernetes.io/projected/e2f38744-a4e3-4cc0-94fe-0190e5c2c772-kube-api-access-ljmfp\") pod \"node-ca-n66bj\" (UID: \"e2f38744-a4e3-4cc0-94fe-0190e5c2c772\") " pod="openshift-image-registry/node-ca-n66bj"
Apr 17 20:15:09.538224 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.538176 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k2w9b"
Apr 17 20:15:09.551525 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.551494 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b78sb"
Apr 17 20:15:09.563342 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.563312 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9qqd7"
Apr 17 20:15:09.571182 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.571152 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg"
Apr 17 20:15:09.580383 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.580044 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ks7qb"
Apr 17 20:15:09.588826 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.588793 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vhbpf"
Apr 17 20:15:09.596663 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.596627 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw"
Apr 17 20:15:09.606648 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.606618 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kjf7q"
Apr 17 20:15:09.612400 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.612375 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n66bj"
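
The paired reconciler_common.go:224 "operationExecutor.MountVolume started" and operation_generator.go:615 "MountVolume.SetUp succeeded" entries above are the kubelet volume manager converging actual state onto desired state: each volume a scheduled pod needs is claimed, mounted, and marked mounted, after which the "No sandbox for pod can be found" lines show the kubelet moving on to creating fresh pod sandboxes for this boot. A minimal Go sketch of that reconcile pattern, with illustrative names only (volume, mount, reconcile are assumptions, not kubelet's real types):

    // Hypothetical sketch of the desired-vs-actual volume reconcile loop
    // suggested by the log pairs above; types and names are illustrative,
    // not kubelet's real ones.
    package main

    import "log"

    type volume struct{ name, pod string }

    // mount stands in for the per-plugin SetUp call; it always succeeds here.
    func mount(v volume) error { return nil }

    func reconcile(desired []volume, mounted map[string]bool) {
        for _, v := range desired {
            if mounted[v.name] {
                continue // already in actual state; nothing to do
            }
            log.Printf("operationExecutor.MountVolume started for volume %q pod %q", v.name, v.pod)
            if err := mount(v); err != nil {
                log.Printf("MountVolume.SetUp failed for volume %q: %v", v.name, err)
                continue // kubelet would requeue this attempt with backoff
            }
            mounted[v.name] = true
            log.Printf("MountVolume.SetUp succeeded for volume %q pod %q", v.name, v.pod)
        }
    }

    func main() {
        desired := []volume{
            {name: "ovnkube-config", pod: "ovnkube-node-tmwcg"},
            {name: "serviceca", pod: "node-ca-n66bj"},
        }
        reconcile(desired, map[string]bool{})
    }

In the real kubelet a failed SetUp is not retried inline; it is requeued with a per-operation delay, which is what the nestedpendingoperations.go:348 entries below record.
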
Need to start a new one" pod="openshift-image-registry/node-ca-n66bj" Apr 17 20:15:09.858001 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.857967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2w5\" (UniqueName: \"kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5\") pod \"network-check-target-z2fcw\" (UID: \"261e1de3-1829-4520-b7ef-6bb874d9f16e\") " pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:09.858168 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:09.858031 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:09.858168 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.858129 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:09.858168 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.858147 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:09.858168 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.858152 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:09.858168 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.858170 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vp2w5 for pod openshift-network-diagnostics/network-check-target-z2fcw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:09.858362 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.858210 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs podName:85885202-a740-4319-827b-236ded2de085 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:10.858192531 +0000 UTC m=+4.029892612 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs") pod "network-metrics-daemon-pwx7f" (UID: "85885202-a740-4319-827b-236ded2de085") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:09.858362 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:09.858229 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5 podName:261e1de3-1829-4520-b7ef-6bb874d9f16e nodeName:}" failed. No retries permitted until 2026-04-17 20:15:10.858221194 +0000 UTC m=+4.029921254 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vp2w5" (UniqueName: "kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5") pod "network-check-target-z2fcw" (UID: "261e1de3-1829-4520-b7ef-6bb874d9f16e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:10.209970 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:10.209937 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b1dd3fa_14f5_434c_919f_b8ceabccf2b6.slice/crio-8c3d85bbace017b095905621855d041052358411035d168bd62e1e43d76e97c2 WatchSource:0}: Error finding container 8c3d85bbace017b095905621855d041052358411035d168bd62e1e43d76e97c2: Status 404 returned error can't find the container with id 8c3d85bbace017b095905621855d041052358411035d168bd62e1e43d76e97c2 Apr 17 20:15:10.211998 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:10.211958 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod968f9d19_7fd2_4b4b_9b77_d24caa01a1ea.slice/crio-19850692edf77d1be954b72cfdb710e9a142e3453d47b9dc41008b4c71e8557a WatchSource:0}: Error finding container 19850692edf77d1be954b72cfdb710e9a142e3453d47b9dc41008b4c71e8557a: Status 404 returned error can't find the container with id 19850692edf77d1be954b72cfdb710e9a142e3453d47b9dc41008b4c71e8557a Apr 17 20:15:10.214303 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:10.214264 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod785f90d6_d02c_465a_8c2b_761d1fb1e10b.slice/crio-9052536f5d5105fbce15fd111b2b6f483e8c5c86daf022e4c34ce9110fe7e720 WatchSource:0}: Error finding container 9052536f5d5105fbce15fd111b2b6f483e8c5c86daf022e4c34ce9110fe7e720: Status 404 returned error can't find the container with id 9052536f5d5105fbce15fd111b2b6f483e8c5c86daf022e4c34ce9110fe7e720 Apr 17 20:15:10.216723 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:10.216692 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c85788b_86c1_4378_923b_48b43a9c6100.slice/crio-52474c47cfb08b1ec3e7bf89c0fe20ceb0ca35028099b2aefb22eb9d779e6008 WatchSource:0}: Error finding container 52474c47cfb08b1ec3e7bf89c0fe20ceb0ca35028099b2aefb22eb9d779e6008: Status 404 returned error can't find the container with id 52474c47cfb08b1ec3e7bf89c0fe20ceb0ca35028099b2aefb22eb9d779e6008 Apr 17 20:15:10.218129 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:10.218099 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2f38744_a4e3_4cc0_94fe_0190e5c2c772.slice/crio-0d317ed769b50f486731c78f0270acc3b147e831d85555d5e4a851b57416bb92 WatchSource:0}: Error finding container 0d317ed769b50f486731c78f0270acc3b147e831d85555d5e4a851b57416bb92: Status 404 returned error can't find the container with id 0d317ed769b50f486731c78f0270acc3b147e831d85555d5e4a851b57416bb92 Apr 17 20:15:10.221502 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:10.221482 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeee41e0_c56f_4312_97f4_258ec5fc4d4d.slice/crio-e74b6b39207caca50cdb143cf10ca2faef9544c40be5b11a070dde83cfd13427 WatchSource:0}: Error finding 
container e74b6b39207caca50cdb143cf10ca2faef9544c40be5b11a070dde83cfd13427: Status 404 returned error can't find the container with id e74b6b39207caca50cdb143cf10ca2faef9544c40be5b11a070dde83cfd13427
Apr 17 20:15:10.244132 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:10.243857 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b2daba4_d463_4f69_96fb_a701b6be9b79.slice/crio-ea26ef1cc7c3bfaa05a69ff596a1420de7a5c10a27a1d233d102cc9b6a9884a0 WatchSource:0}: Error finding container ea26ef1cc7c3bfaa05a69ff596a1420de7a5c10a27a1d233d102cc9b6a9884a0: Status 404 returned error can't find the container with id ea26ef1cc7c3bfaa05a69ff596a1420de7a5c10a27a1d233d102cc9b6a9884a0
Apr 17 20:15:10.244607 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:10.244534 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf008c7_b5c3_4d78_9dcc_a9522d8e0a26.slice/crio-4daae80b617a4673c128d7eae6a4ac8b8f6e43865789dde1a6eb77e5fd0508ba WatchSource:0}: Error finding container 4daae80b617a4673c128d7eae6a4ac8b8f6e43865789dde1a6eb77e5fd0508ba: Status 404 returned error can't find the container with id 4daae80b617a4673c128d7eae6a4ac8b8f6e43865789dde1a6eb77e5fd0508ba
Apr 17 20:15:10.245487 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:10.245467 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda978ea1_70f9_45a3_9f4e_e26cc21544b2.slice/crio-fd824c449bddbb72c9a2db1a1553e7c3aca5c3b58b4834186ff086ada93a2103 WatchSource:0}: Error finding container fd824c449bddbb72c9a2db1a1553e7c3aca5c3b58b4834186ff086ada93a2103: Status 404 returned error can't find the container with id fd824c449bddbb72c9a2db1a1553e7c3aca5c3b58b4834186ff086ada93a2103
Apr 17 20:15:10.270945 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.270907 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:10:08 +0000 UTC" deadline="2027-10-27 10:44:40.559897918 +0000 UTC"
Apr 17 20:15:10.270945 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.270943 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13382h29m30.288957322s"
Apr 17 20:15:10.334656 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.334627 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw"
Apr 17 20:15:10.334806 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:10.334731 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e"
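
The "Error syncing pod" entry above, repeated below roughly once a second for network-check-target-z2fcw and network-metrics-daemon-pwx7f, means the container runtime is reporting NetworkReady=false because no CNI config file exists yet in /etc/kubernetes/cni/net.d/; on an OVN-Kubernetes node that file appears only once ovnkube-node is up, so pod-network pods are held back while the host-network daemonsets keep starting. A rough Go sketch of the kind of readiness check the message implies (directory and extensions are inferred from the message itself, not taken from CRI-O's actual code):

    // Hypothetical sketch of a CNI readiness probe: report NetworkReady=false
    // until a network config file shows up in net.d. Directory and extensions
    // come from the error message above; the logic is illustrative, not CRI-O's.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
        "strings"
    )

    func cniConfigPresent(dir string) (bool, error) {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false, err
        }
        for _, e := range entries {
            ext := strings.ToLower(filepath.Ext(e.Name()))
            if !e.IsDir() && (ext == ".conf" || ext == ".conflist" || ext == ".json") {
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        ready, err := cniConfigPresent("/etc/kubernetes/cni/net.d")
        if err != nil || !ready {
            fmt.Println("NetworkReady=false reason:NetworkPluginNotReady")
            return
        }
        fmt.Println("NetworkReady=true")
    }
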
pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:10.344209 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.344178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9qqd7" event={"ID":"4b2daba4-d463-4f69-96fb-a701b6be9b79","Type":"ContainerStarted","Data":"ea26ef1cc7c3bfaa05a69ff596a1420de7a5c10a27a1d233d102cc9b6a9884a0"} Apr 17 20:15:10.345077 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.345051 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vhbpf" event={"ID":"785f90d6-d02c-465a-8c2b-761d1fb1e10b","Type":"ContainerStarted","Data":"9052536f5d5105fbce15fd111b2b6f483e8c5c86daf022e4c34ce9110fe7e720"} Apr 17 20:15:10.346051 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.346017 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k2w9b" event={"ID":"da978ea1-70f9-45a3-9f4e-e26cc21544b2","Type":"ContainerStarted","Data":"fd824c449bddbb72c9a2db1a1553e7c3aca5c3b58b4834186ff086ada93a2103"} Apr 17 20:15:10.346947 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.346927 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b78sb" event={"ID":"4c85788b-86c1-4378-923b-48b43a9c6100","Type":"ContainerStarted","Data":"52474c47cfb08b1ec3e7bf89c0fe20ceb0ca35028099b2aefb22eb9d779e6008"} Apr 17 20:15:10.347776 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.347737 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" event={"ID":"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea","Type":"ContainerStarted","Data":"19850692edf77d1be954b72cfdb710e9a142e3453d47b9dc41008b4c71e8557a"} Apr 17 20:15:10.348673 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.348653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ks7qb" event={"ID":"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6","Type":"ContainerStarted","Data":"8c3d85bbace017b095905621855d041052358411035d168bd62e1e43d76e97c2"} Apr 17 20:15:10.350217 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.350199 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal" event={"ID":"06b467c1ac790ac275ddafdf170566ce","Type":"ContainerStarted","Data":"93d5bd932a877a5fd7d9522936f5b041bd6466232ea8d52ac62391c3c3b02c89"} Apr 17 20:15:10.351204 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.351182 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n66bj" event={"ID":"e2f38744-a4e3-4cc0-94fe-0190e5c2c772","Type":"ContainerStarted","Data":"0d317ed769b50f486731c78f0270acc3b147e831d85555d5e4a851b57416bb92"} Apr 17 20:15:10.352158 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.352138 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" event={"ID":"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26","Type":"ContainerStarted","Data":"4daae80b617a4673c128d7eae6a4ac8b8f6e43865789dde1a6eb77e5fd0508ba"} Apr 17 20:15:10.352982 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.352962 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kjf7q" event={"ID":"eeee41e0-c56f-4312-97f4-258ec5fc4d4d","Type":"ContainerStarted","Data":"e74b6b39207caca50cdb143cf10ca2faef9544c40be5b11a070dde83cfd13427"} Apr 17 20:15:10.361441 ip-10-0-129-50 
kubenswrapper[2571]: I0417 20:15:10.361399 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-50.ec2.internal" podStartSLOduration=2.361386355 podStartE2EDuration="2.361386355s" podCreationTimestamp="2026-04-17 20:15:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:15:10.361179751 +0000 UTC m=+3.532879821" watchObservedRunningTime="2026-04-17 20:15:10.361386355 +0000 UTC m=+3.533086435" Apr 17 20:15:10.864678 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.864638 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:10.864855 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:10.864717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2w5\" (UniqueName: \"kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5\") pod \"network-check-target-z2fcw\" (UID: \"261e1de3-1829-4520-b7ef-6bb874d9f16e\") " pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:10.864992 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:10.864899 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:10.864992 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:10.864919 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:10.864992 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:10.864932 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vp2w5 for pod openshift-network-diagnostics/network-check-target-z2fcw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:10.864992 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:10.864991 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5 podName:261e1de3-1829-4520-b7ef-6bb874d9f16e nodeName:}" failed. No retries permitted until 2026-04-17 20:15:12.864972377 +0000 UTC m=+6.036672459 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vp2w5" (UniqueName: "kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5") pod "network-check-target-z2fcw" (UID: "261e1de3-1829-4520-b7ef-6bb874d9f16e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:10.865453 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:10.865417 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:10.865552 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:10.865505 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs podName:85885202-a740-4319-827b-236ded2de085 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:12.865462827 +0000 UTC m=+6.037162903 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs") pod "network-metrics-daemon-pwx7f" (UID: "85885202-a740-4319-827b-236ded2de085") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:11.340136 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:11.339245 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:11.340136 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:11.339400 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:11.376334 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:11.376018 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" event={"ID":"3558e7db711dfe8c57fb2595118440f7","Type":"ContainerStarted","Data":"0e9f53958e31c83268cfafc7a5e229db159fb4b80ad4e9b85cf4d999303e36ea"} Apr 17 20:15:12.335063 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:12.335027 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:12.335254 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:12.335162 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:12.394751 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:12.394009 2571 generic.go:358] "Generic (PLEG): container finished" podID="3558e7db711dfe8c57fb2595118440f7" containerID="0e9f53958e31c83268cfafc7a5e229db159fb4b80ad4e9b85cf4d999303e36ea" exitCode=0 Apr 17 20:15:12.394751 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:12.394084 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" event={"ID":"3558e7db711dfe8c57fb2595118440f7","Type":"ContainerDied","Data":"0e9f53958e31c83268cfafc7a5e229db159fb4b80ad4e9b85cf4d999303e36ea"} Apr 17 20:15:12.394751 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:12.394114 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" event={"ID":"3558e7db711dfe8c57fb2595118440f7","Type":"ContainerStarted","Data":"20fb335c978cfaa15f228696195891e308c5ec9eaef181aac2dddf530252ac64"} Apr 17 20:15:12.883917 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:12.883061 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:12.883917 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:12.883132 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2w5\" (UniqueName: \"kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5\") pod \"network-check-target-z2fcw\" (UID: \"261e1de3-1829-4520-b7ef-6bb874d9f16e\") " pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:12.883917 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:12.883300 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:12.883917 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:12.883318 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:12.883917 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:12.883332 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vp2w5 for pod openshift-network-diagnostics/network-check-target-z2fcw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:12.883917 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:12.883391 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5 podName:261e1de3-1829-4520-b7ef-6bb874d9f16e nodeName:}" failed. No retries permitted until 2026-04-17 20:15:16.883373583 +0000 UTC m=+10.055073648 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vp2w5" (UniqueName: "kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5") pod "network-check-target-z2fcw" (UID: "261e1de3-1829-4520-b7ef-6bb874d9f16e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:12.883917 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:12.883809 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:12.883917 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:12.883861 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs podName:85885202-a740-4319-827b-236ded2de085 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:16.883843671 +0000 UTC m=+10.055543744 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs") pod "network-metrics-daemon-pwx7f" (UID: "85885202-a740-4319-827b-236ded2de085") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:13.335383 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:13.335347 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:13.335592 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:13.335495 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:14.334491 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:14.334451 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:14.334973 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:14.334592 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:15.334420 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:15.334373 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:15.334607 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:15.334499 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:16.334977 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:16.334922 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:16.335454 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:16.335063 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:16.916661 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:16.916617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:16.916840 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:16.916690 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2w5\" (UniqueName: \"kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5\") pod \"network-check-target-z2fcw\" (UID: \"261e1de3-1829-4520-b7ef-6bb874d9f16e\") " pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:16.916840 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:16.916750 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:16.916840 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:16.916812 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:16.916840 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:16.916824 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs podName:85885202-a740-4319-827b-236ded2de085 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:24.916808351 +0000 UTC m=+18.088508410 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs") pod "network-metrics-daemon-pwx7f" (UID: "85885202-a740-4319-827b-236ded2de085") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:16.916840 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:16.916829 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:16.917137 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:16.916846 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vp2w5 for pod openshift-network-diagnostics/network-check-target-z2fcw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:16.917137 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:16.916921 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5 podName:261e1de3-1829-4520-b7ef-6bb874d9f16e nodeName:}" failed. 
No retries permitted until 2026-04-17 20:15:24.916906032 +0000 UTC m=+18.088606108 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vp2w5" (UniqueName: "kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5") pod "network-check-target-z2fcw" (UID: "261e1de3-1829-4520-b7ef-6bb874d9f16e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:17.334925 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:17.334872 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:17.335108 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:17.335040 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:18.334550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:18.334511 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:18.334757 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:18.334643 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:19.334727 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:19.334694 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:19.335210 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:19.334828 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:20.335024 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:20.334988 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:20.335473 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:20.335116 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:21.334821 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:21.334779 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:21.335023 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:21.334951 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:22.335327 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:22.335285 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:22.335755 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:22.335426 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:23.334678 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:23.334637 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:23.334869 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:23.334789 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:24.334829 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:24.334781 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:24.335248 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:24.334931 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:24.974419 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:24.974365 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2w5\" (UniqueName: \"kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5\") pod \"network-check-target-z2fcw\" (UID: \"261e1de3-1829-4520-b7ef-6bb874d9f16e\") " pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:24.974702 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:24.974453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:24.974702 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:24.974559 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:24.974702 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:24.974559 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:24.974702 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:24.974592 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:24.974702 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:24.974605 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vp2w5 for pod openshift-network-diagnostics/network-check-target-z2fcw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:24.974702 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:24.974620 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs podName:85885202-a740-4319-827b-236ded2de085 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:40.974603247 +0000 UTC m=+34.146303323 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs") pod "network-metrics-daemon-pwx7f" (UID: "85885202-a740-4319-827b-236ded2de085") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:24.974702 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:24.974661 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5 podName:261e1de3-1829-4520-b7ef-6bb874d9f16e nodeName:}" failed. No retries permitted until 2026-04-17 20:15:40.974644058 +0000 UTC m=+34.146344140 (durationBeforeRetry 16s). 
Apr 17 20:15:25.335118 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:25.335076 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f"
Apr 17 20:15:25.335606 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:25.335224 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085"
Apr 17 20:15:26.334833 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:26.334786 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw"
Apr 17 20:15:26.335029 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:26.334922 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e"
Apr 17 20:15:27.335086 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:27.335051 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f"
Apr 17 20:15:27.335541 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:27.335156 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085"
Apr 17 20:15:28.334806 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.334574 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw"
Apr 17 20:15:28.334997 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:28.334945 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e"
pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:28.427861 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.427808 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n66bj" event={"ID":"e2f38744-a4e3-4cc0-94fe-0190e5c2c772","Type":"ContainerStarted","Data":"ca6a80d12602a3f120af9f5e6f5a044776ac0eb674f605fb2085e57b3022e9be"} Apr 17 20:15:28.431820 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.431190 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" event={"ID":"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26","Type":"ContainerStarted","Data":"149d41c5b7ef41d3027fb0b93767f2316c8e27458c6b832fe7f9744631863b79"} Apr 17 20:15:28.431820 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.431228 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" event={"ID":"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26","Type":"ContainerStarted","Data":"da04616d7ceca7fac0b14ce5e0c6eaa36dd98d35df3a14b223d9230808206b2b"} Apr 17 20:15:28.431820 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.431243 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" event={"ID":"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26","Type":"ContainerStarted","Data":"0b3099097026bdfa0d9530bb1dc7550b2115594a2039c12e9e13c8903332104b"} Apr 17 20:15:28.432596 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.432569 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kjf7q" event={"ID":"eeee41e0-c56f-4312-97f4-258ec5fc4d4d","Type":"ContainerStarted","Data":"773cc424035e630cfae5e83b030f7ab0a8ce377a1e312e98e75d86fd8ae5a664"} Apr 17 20:15:28.434083 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.434056 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9qqd7" event={"ID":"4b2daba4-d463-4f69-96fb-a701b6be9b79","Type":"ContainerStarted","Data":"78885f35309786839a1486ebebce8eead7656ada51eb91f9869921412b482e76"} Apr 17 20:15:28.436088 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.435733 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vhbpf" event={"ID":"785f90d6-d02c-465a-8c2b-761d1fb1e10b","Type":"ContainerStarted","Data":"3ff009cd1048efb17e2734f5d96737cd2eb198a03dbcef321837906dd02c36bd"} Apr 17 20:15:28.437181 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.437156 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b78sb" event={"ID":"4c85788b-86c1-4378-923b-48b43a9c6100","Type":"ContainerStarted","Data":"ded02528b352a64695055366e6d99ecdbefe649a860529a82e71b507e2811c4d"} Apr 17 20:15:28.438622 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.438590 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" event={"ID":"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea","Type":"ContainerStarted","Data":"cc1dc48e74e43b91934d1f7ca5c90116bcca289b3734b3f5a53830c247e986a7"} Apr 17 20:15:28.440085 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.440060 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b1dd3fa-14f5-434c-919f-b8ceabccf2b6" containerID="74583811d8ee4f720685537828ffcb8fbfb96e89b6d0a7057f442deec01abec1" exitCode=0 Apr 17 20:15:28.440152 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.440098 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-ks7qb" event={"ID":"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6","Type":"ContainerDied","Data":"74583811d8ee4f720685537828ffcb8fbfb96e89b6d0a7057f442deec01abec1"} Apr 17 20:15:28.453339 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.453298 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n66bj" podStartSLOduration=8.578187731 podStartE2EDuration="21.453283349s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:15:10.242427814 +0000 UTC m=+3.414127877" lastFinishedPulling="2026-04-17 20:15:23.117523436 +0000 UTC m=+16.289223495" observedRunningTime="2026-04-17 20:15:28.452971301 +0000 UTC m=+21.624671413" watchObservedRunningTime="2026-04-17 20:15:28.453283349 +0000 UTC m=+21.624983421" Apr 17 20:15:28.453653 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.453629 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-50.ec2.internal" podStartSLOduration=20.453621119 podStartE2EDuration="20.453621119s" podCreationTimestamp="2026-04-17 20:15:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:15:12.407224933 +0000 UTC m=+5.578925017" watchObservedRunningTime="2026-04-17 20:15:28.453621119 +0000 UTC m=+21.625321190" Apr 17 20:15:28.469081 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.469028 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-b78sb" podStartSLOduration=4.116477461 podStartE2EDuration="21.469012962s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:15:10.2211094 +0000 UTC m=+3.392809462" lastFinishedPulling="2026-04-17 20:15:27.573644882 +0000 UTC m=+20.745344963" observedRunningTime="2026-04-17 20:15:28.468601948 +0000 UTC m=+21.640302063" watchObservedRunningTime="2026-04-17 20:15:28.469012962 +0000 UTC m=+21.640713043" Apr 17 20:15:28.530184 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.530141 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9qqd7" podStartSLOduration=4.171255483 podStartE2EDuration="21.530127081s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:15:10.247972397 +0000 UTC m=+3.419672457" lastFinishedPulling="2026-04-17 20:15:27.606843995 +0000 UTC m=+20.778544055" observedRunningTime="2026-04-17 20:15:28.530016516 +0000 UTC m=+21.701716597" watchObservedRunningTime="2026-04-17 20:15:28.530127081 +0000 UTC m=+21.701827162" Apr 17 20:15:28.530717 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.530678 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vhbpf" podStartSLOduration=4.176109723 podStartE2EDuration="21.530662283s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:15:10.216663008 +0000 UTC m=+3.388363085" lastFinishedPulling="2026-04-17 20:15:27.571215574 +0000 UTC m=+20.742915645" observedRunningTime="2026-04-17 20:15:28.513945994 +0000 UTC m=+21.685646075" watchObservedRunningTime="2026-04-17 20:15:28.530662283 +0000 UTC m=+21.702362365" Apr 17 20:15:28.543030 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:28.542987 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-kjf7q" podStartSLOduration=4.214167515 podStartE2EDuration="21.542973122s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:15:10.24241228 +0000 UTC m=+3.414112342" lastFinishedPulling="2026-04-17 20:15:27.571217883 +0000 UTC m=+20.742917949" observedRunningTime="2026-04-17 20:15:28.542801019 +0000 UTC m=+21.714501099" watchObservedRunningTime="2026-04-17 20:15:28.542973122 +0000 UTC m=+21.714673202" Apr 17 20:15:29.187193 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.187166 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 20:15:29.308615 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.308518 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:15:29.187187916Z","UUID":"4b7eb95a-5d6c-4b4f-8891-496c0cabb399","Handler":null,"Name":"","Endpoint":""} Apr 17 20:15:29.310387 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.310366 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 20:15:29.310387 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.310393 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 20:15:29.334977 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.334873 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:29.335130 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:29.335023 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:29.445096 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.445058 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" event={"ID":"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26","Type":"ContainerStarted","Data":"89452aa32d1575c6667cb67cd2d039bd34e989026a74278ba6b68cabc23a5545"} Apr 17 20:15:29.445096 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.445100 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" event={"ID":"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26","Type":"ContainerStarted","Data":"cb9c884491d7b307c25e91660413bff7b39b033d7b2cbebc297cf58300daf1a5"} Apr 17 20:15:29.445810 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.445110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" event={"ID":"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26","Type":"ContainerStarted","Data":"9d74c2d2d26d34b899e669c1b7f997b123e213020f60011da9540dc0b1df074b"} Apr 17 20:15:29.446350 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.446324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k2w9b" event={"ID":"da978ea1-70f9-45a3-9f4e-e26cc21544b2","Type":"ContainerStarted","Data":"04cc0b2519a43f660ae1c290cc6b15ccae95904317853af4da611281a7a64795"} Apr 17 20:15:29.447865 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.447833 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" event={"ID":"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea","Type":"ContainerStarted","Data":"eaab69d577ae980a741a94ef5de8ca5516b4c36666e78619f6bbe427e9b6d88c"} Apr 17 20:15:29.459096 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:29.459049 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-k2w9b" podStartSLOduration=5.133580543 podStartE2EDuration="22.459034977s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:15:10.24795625 +0000 UTC m=+3.419656313" lastFinishedPulling="2026-04-17 20:15:27.573410683 +0000 UTC m=+20.745110747" observedRunningTime="2026-04-17 20:15:29.458734356 +0000 UTC m=+22.630434434" watchObservedRunningTime="2026-04-17 20:15:29.459034977 +0000 UTC m=+22.630735059" Apr 17 20:15:30.334645 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:30.334604 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:30.334975 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:30.334739 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:30.620031 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:30.619998 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:30.620928 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:30.620908 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:31.053213 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.053009 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:31.054005 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.053984 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vhbpf" Apr 17 20:15:31.061208 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.061187 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-92pvt"] Apr 17 20:15:31.089452 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.089426 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:31.089580 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:31.089515 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-92pvt" podUID="30a5e026-7834-4983-8c72-94991eb23377" Apr 17 20:15:31.218182 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.218089 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/30a5e026-7834-4983-8c72-94991eb23377-kubelet-config\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:31.218182 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.218142 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/30a5e026-7834-4983-8c72-94991eb23377-dbus\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:31.218404 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.218252 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:31.318746 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.318702 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:31.318971 ip-10-0-129-50 kubenswrapper[2571]: I0417 
20:15:31.318758 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/30a5e026-7834-4983-8c72-94991eb23377-kubelet-config\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:31.318971 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.318828 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/30a5e026-7834-4983-8c72-94991eb23377-kubelet-config\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:31.318971 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:31.318875 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:31.318971 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:31.318957 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret podName:30a5e026-7834-4983-8c72-94991eb23377 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:31.81894315 +0000 UTC m=+24.990643213 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret") pod "global-pull-secret-syncer-92pvt" (UID: "30a5e026-7834-4983-8c72-94991eb23377") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:31.319152 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.318893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/30a5e026-7834-4983-8c72-94991eb23377-dbus\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:31.319152 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.319075 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/30a5e026-7834-4983-8c72-94991eb23377-dbus\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:31.335388 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.335359 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:31.335598 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:31.335494 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:31.453450 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.453411 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" event={"ID":"968f9d19-7fd2-4b4b-9b77-d24caa01a1ea","Type":"ContainerStarted","Data":"490119f55c630d1e4a4ffa5185523bb5e80c8549ca7b63170c3b5631e6ea8e0f"} Apr 17 20:15:31.457474 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.456774 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" event={"ID":"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26","Type":"ContainerStarted","Data":"a0307cd846833cacb85801d0b73e4135032c1dc28630a51079759b316ed8aca3"} Apr 17 20:15:31.470765 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.470663 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sprdw" podStartSLOduration=4.165615503 podStartE2EDuration="24.47064901s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:15:10.214268579 +0000 UTC m=+3.385968644" lastFinishedPulling="2026-04-17 20:15:30.519302076 +0000 UTC m=+23.691002151" observedRunningTime="2026-04-17 20:15:31.469956298 +0000 UTC m=+24.641656374" watchObservedRunningTime="2026-04-17 20:15:31.47064901 +0000 UTC m=+24.642349091" Apr 17 20:15:31.822751 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:31.822701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:31.823250 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:31.822894 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:31.823250 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:31.822981 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret podName:30a5e026-7834-4983-8c72-94991eb23377 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:32.822960392 +0000 UTC m=+25.994660451 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret") pod "global-pull-secret-syncer-92pvt" (UID: "30a5e026-7834-4983-8c72-94991eb23377") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:32.335077 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:32.335039 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:32.335265 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:32.335163 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:32.831243 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:32.831190 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:32.831720 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:32.831360 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:32.831720 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:32.831430 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret podName:30a5e026-7834-4983-8c72-94991eb23377 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:34.831410033 +0000 UTC m=+28.003110092 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret") pod "global-pull-secret-syncer-92pvt" (UID: "30a5e026-7834-4983-8c72-94991eb23377") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:33.334980 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:33.334941 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:33.334980 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:33.334972 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:33.335202 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:33.335087 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:33.335264 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:33.335219 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-92pvt" podUID="30a5e026-7834-4983-8c72-94991eb23377" Apr 17 20:15:34.334645 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:34.334614 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:34.335073 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:34.334735 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:34.465800 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:34.465771 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" event={"ID":"fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26","Type":"ContainerStarted","Data":"7478f30edd861d73bbd24743ed78c6b959708dbd1a0c00aa7520d272dfd7a2f8"} Apr 17 20:15:34.466129 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:34.466098 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:34.466253 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:34.466142 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:34.481479 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:34.481445 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:34.481584 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:34.481517 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:34.489125 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:34.489086 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" podStartSLOduration=9.802614918 podStartE2EDuration="27.489073863s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:15:10.248194062 +0000 UTC m=+3.419894121" lastFinishedPulling="2026-04-17 20:15:27.93465299 +0000 UTC m=+21.106353066" observedRunningTime="2026-04-17 20:15:34.488845913 +0000 UTC m=+27.660545996" watchObservedRunningTime="2026-04-17 20:15:34.489073863 +0000 UTC m=+27.660773943" Apr 17 20:15:34.845561 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:34.845520 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:34.845717 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:34.845671 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:34.845775 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:34.845747 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret podName:30a5e026-7834-4983-8c72-94991eb23377 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:38.845725768 +0000 UTC m=+32.017425828 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret") pod "global-pull-secret-syncer-92pvt" (UID: "30a5e026-7834-4983-8c72-94991eb23377") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:35.335383 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:35.335341 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:35.335788 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:35.335394 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:35.335788 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:35.335487 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-92pvt" podUID="30a5e026-7834-4983-8c72-94991eb23377" Apr 17 20:15:35.335788 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:35.335598 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:35.469695 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:35.469654 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b1dd3fa-14f5-434c-919f-b8ceabccf2b6" containerID="e2ffeb93285ae44d1b3a037dce7a67e1aa072bef680db334a5d77f2431d27099" exitCode=0 Apr 17 20:15:35.469954 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:35.469938 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 20:15:35.471617 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:35.471586 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ks7qb" event={"ID":"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6","Type":"ContainerDied","Data":"e2ffeb93285ae44d1b3a037dce7a67e1aa072bef680db334a5d77f2431d27099"} Apr 17 20:15:36.253615 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:36.253577 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:15:36.334853 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:36.334818 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:36.335071 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:36.334975 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:36.382630 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:36.382597 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z2fcw"] Apr 17 20:15:36.385783 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:36.385754 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-92pvt"] Apr 17 20:15:36.385989 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:36.385871 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:36.386060 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:36.386017 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-92pvt" podUID="30a5e026-7834-4983-8c72-94991eb23377" Apr 17 20:15:36.386632 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:36.386411 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pwx7f"] Apr 17 20:15:36.386632 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:36.386515 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:36.386828 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:36.386630 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:36.471554 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:36.471526 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:36.471730 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:36.471632 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:37.475217 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:37.475120 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b1dd3fa-14f5-434c-919f-b8ceabccf2b6" containerID="2a4a9a5892d84cb52ca9d56b89a1cffd2c8630d131f08b6814fdba99dcf8b073" exitCode=0 Apr 17 20:15:37.475645 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:37.475206 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ks7qb" event={"ID":"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6","Type":"ContainerDied","Data":"2a4a9a5892d84cb52ca9d56b89a1cffd2c8630d131f08b6814fdba99dcf8b073"} Apr 17 20:15:38.334537 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:38.334333 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:38.334723 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:38.334332 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:38.334723 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:38.334637 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:38.334723 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:38.334685 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-92pvt" podUID="30a5e026-7834-4983-8c72-94991eb23377" Apr 17 20:15:38.334723 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:38.334361 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:38.334940 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:38.334828 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:38.876739 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:38.876710 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:38.877074 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:38.876822 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:38.877074 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:38.876873 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret podName:30a5e026-7834-4983-8c72-94991eb23377 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:46.876860657 +0000 UTC m=+40.048560715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret") pod "global-pull-secret-syncer-92pvt" (UID: "30a5e026-7834-4983-8c72-94991eb23377") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:15:39.480944 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:39.480835 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b1dd3fa-14f5-434c-919f-b8ceabccf2b6" containerID="515eb9ac08bb00d20410f028075c09b2775c7f482eb2df6a7a9997e909279bf6" exitCode=0 Apr 17 20:15:39.480944 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:39.480911 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ks7qb" event={"ID":"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6","Type":"ContainerDied","Data":"515eb9ac08bb00d20410f028075c09b2775c7f482eb2df6a7a9997e909279bf6"} Apr 17 20:15:40.335141 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.335104 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:40.335555 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.335104 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:40.335555 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.335104 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:40.335555 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.335279 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:15:40.335555 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.335390 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-92pvt" podUID="30a5e026-7834-4983-8c72-94991eb23377" Apr 17 20:15:40.335555 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.335462 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2fcw" podUID="261e1de3-1829-4520-b7ef-6bb874d9f16e" Apr 17 20:15:40.674915 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.674819 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-50.ec2.internal" event="NodeReady" Apr 17 20:15:40.675082 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.674978 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 20:15:40.706569 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.706534 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl"] Apr 17 20:15:40.721675 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.721645 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m"] Apr 17 20:15:40.721908 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.721861 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" Apr 17 20:15:40.726291 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.726126 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 20:15:40.726291 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.726151 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-bkvlp\"" Apr 17 20:15:40.726291 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.726163 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 20:15:40.726555 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.726322 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 20:15:40.727085 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.727062 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 20:15:40.736917 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.736416 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-779c7964f8-9mdv7"] Apr 17 20:15:40.736917 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.736571 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.738689 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.738664 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 20:15:40.739178 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.739155 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 20:15:40.739289 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.739200 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 20:15:40.739353 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.739298 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 20:15:40.742289 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.742268 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5"] Apr 17 20:15:40.742463 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.742439 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.745910 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.744516 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tkw2s\"" Apr 17 20:15:40.745910 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.744543 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 20:15:40.745910 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.744562 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 20:15:40.746616 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.746341 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 20:15:40.750947 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.750928 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 20:15:40.753307 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.753286 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl"] Apr 17 20:15:40.753396 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.753318 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m"] Apr 17 20:15:40.753396 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.753334 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-779c7964f8-9mdv7"] Apr 17 20:15:40.753396 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.753347 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qzgfk"] Apr 17 20:15:40.753616 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.753591 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:40.755520 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.755501 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 20:15:40.761563 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.761545 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5"] Apr 17 20:15:40.761667 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.761571 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qzgfk"] Apr 17 20:15:40.761725 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.761668 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.763964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.763936 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 20:15:40.763964 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.763959 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x9tzj\"" Apr 17 20:15:40.764111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.763941 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 20:15:40.790038 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.790012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sg9x\" (UniqueName: \"kubernetes.io/projected/340f32d4-9729-453c-b1ff-b7f35ba6d9ec-kube-api-access-8sg9x\") pod \"managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl\" (UID: \"340f32d4-9729-453c-b1ff-b7f35ba6d9ec\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" Apr 17 20:15:40.790183 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.790069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/340f32d4-9729-453c-b1ff-b7f35ba6d9ec-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl\" (UID: \"340f32d4-9729-453c-b1ff-b7f35ba6d9ec\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" Apr 17 20:15:40.824633 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.824593 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-thggh"] Apr 17 20:15:40.838869 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.838842 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-thggh"] Apr 17 20:15:40.839038 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.838986 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:40.841160 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.841137 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 20:15:40.841345 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.841194 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 20:15:40.841345 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.841249 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 20:15:40.841345 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.841314 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ndtv9\"" Apr 17 20:15:40.891121 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891076 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-ca\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.891294 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891134 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sg9x\" (UniqueName: \"kubernetes.io/projected/340f32d4-9729-453c-b1ff-b7f35ba6d9ec-kube-api-access-8sg9x\") pod \"managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl\" (UID: \"340f32d4-9729-453c-b1ff-b7f35ba6d9ec\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" Apr 17 20:15:40.891294 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891199 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.891294 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891235 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.891294 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891260 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.891487 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891328 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.891487 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/340f32d4-9729-453c-b1ff-b7f35ba6d9ec-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl\" (UID: \"340f32d4-9729-453c-b1ff-b7f35ba6d9ec\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" Apr 17 20:15:40.891487 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5ltz\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-kube-api-access-h5ltz\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.891487 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891429 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-config-volume\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.891651 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891483 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/021577ca-5399-48fc-bacd-f0800327517d-tmp\") pod \"klusterlet-addon-workmgr-654f86b6b4-nqtf5\" (UID: \"021577ca-5399-48fc-bacd-f0800327517d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:40.891651 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891514 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/021577ca-5399-48fc-bacd-f0800327517d-klusterlet-config\") pod \"klusterlet-addon-workmgr-654f86b6b4-nqtf5\" (UID: \"021577ca-5399-48fc-bacd-f0800327517d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:40.891651 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891541 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8e718c17-e639-4044-b14e-a421084115db-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.891651 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891570 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xffcl\" (UniqueName: \"kubernetes.io/projected/8e718c17-e639-4044-b14e-a421084115db-kube-api-access-xffcl\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.891651 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891597 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4nhm\" (UniqueName: \"kubernetes.io/projected/021577ca-5399-48fc-bacd-f0800327517d-kube-api-access-d4nhm\") pod \"klusterlet-addon-workmgr-654f86b6b4-nqtf5\" (UID: \"021577ca-5399-48fc-bacd-f0800327517d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:40.891651 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891631 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-installation-pull-secrets\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.891870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891659 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-tmp-dir\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.891870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891685 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-trusted-ca\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.891870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-bound-sa-token\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.891870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891747 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-certificates\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.891870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891767 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgt7f\" (UniqueName: \"kubernetes.io/projected/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-kube-api-access-lgt7f\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.891870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891813 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-hub\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.891870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891829 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d21b354-7081-4159-8dfa-77c0c4704b30-ca-trust-extracted\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.891870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.891857 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-image-registry-private-configuration\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.898382 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.898362 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sg9x\" (UniqueName: \"kubernetes.io/projected/340f32d4-9729-453c-b1ff-b7f35ba6d9ec-kube-api-access-8sg9x\") pod \"managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl\" (UID: \"340f32d4-9729-453c-b1ff-b7f35ba6d9ec\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" Apr 17 20:15:40.902139 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.902112 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/340f32d4-9729-453c-b1ff-b7f35ba6d9ec-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl\" (UID: \"340f32d4-9729-453c-b1ff-b7f35ba6d9ec\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" Apr 17 20:15:40.992394 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992306 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5ltz\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-kube-api-access-h5ltz\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.992394 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992347 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-config-volume\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.992394 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992373 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/021577ca-5399-48fc-bacd-f0800327517d-tmp\") pod \"klusterlet-addon-workmgr-654f86b6b4-nqtf5\" (UID: \"021577ca-5399-48fc-bacd-f0800327517d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:40.992394 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/021577ca-5399-48fc-bacd-f0800327517d-klusterlet-config\") pod \"klusterlet-addon-workmgr-654f86b6b4-nqtf5\" (UID: \"021577ca-5399-48fc-bacd-f0800327517d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 
20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992416 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8e718c17-e639-4044-b14e-a421084115db-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xffcl\" (UniqueName: \"kubernetes.io/projected/8e718c17-e639-4044-b14e-a421084115db-kube-api-access-xffcl\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4nhm\" (UniqueName: \"kubernetes.io/projected/021577ca-5399-48fc-bacd-f0800327517d-kube-api-access-d4nhm\") pod \"klusterlet-addon-workmgr-654f86b6b4-nqtf5\" (UID: \"021577ca-5399-48fc-bacd-f0800327517d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992492 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-installation-pull-secrets\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-tmp-dir\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-trusted-ca\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-bound-sa-token\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992612 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-certificates\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.992743 ip-10-0-129-50 
kubenswrapper[2571]: I0417 20:15:40.992637 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgt7f\" (UniqueName: \"kubernetes.io/projected/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-kube-api-access-lgt7f\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:40.992743 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-hub\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992754 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d21b354-7081-4159-8dfa-77c0c4704b30-ca-trust-extracted\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-image-registry-private-configuration\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-ca\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992928 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.992992 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb95n\" (UniqueName: \"kubernetes.io/projected/84435488-0f4e-4e67-98ca-b76e36c0b583-kube-api-access-jb95n\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.993006 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-tmp-dir\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.993020 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2w5\" (UniqueName: \"kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5\") pod \"network-check-target-z2fcw\" (UID: \"261e1de3-1829-4520-b7ef-6bb874d9f16e\") " pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.993075 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.993101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-config-volume\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993111 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993151 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993165 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vp2w5 for pod openshift-network-diagnostics/network-check-target-z2fcw: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993191 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993225 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5 podName:261e1de3-1829-4520-b7ef-6bb874d9f16e nodeName:}" failed. No retries permitted until 2026-04-17 20:16:12.993207652 +0000 UTC m=+66.164907713 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-vp2w5" (UniqueName: "kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5") pod "network-check-target-z2fcw" (UID: "261e1de3-1829-4520-b7ef-6bb874d9f16e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:15:40.993386 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993234 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:40.994433 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.993280 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/021577ca-5399-48fc-bacd-f0800327517d-tmp\") pod \"klusterlet-addon-workmgr-654f86b6b4-nqtf5\" (UID: \"021577ca-5399-48fc-bacd-f0800327517d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:40.994433 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993651 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:15:40.994433 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993667 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c7964f8-9mdv7: secret "image-registry-tls" not found Apr 17 20:15:40.994433 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.993733 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-certificates\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.994433 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.993865 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-trusted-ca\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.994433 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993250 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls podName:0ac23e99-190f-4c9d-80e2-1ba8d936a9f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:41.49323392 +0000 UTC m=+34.664933984 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls") pod "dns-default-qzgfk" (UID: "0ac23e99-190f-4c9d-80e2-1ba8d936a9f1") : secret "dns-default-metrics-tls" not found Apr 17 20:15:40.994433 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993956 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs podName:85885202-a740-4319-827b-236ded2de085 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:12.993936729 +0000 UTC m=+66.165636794 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs") pod "network-metrics-daemon-pwx7f" (UID: "85885202-a740-4319-827b-236ded2de085") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:15:40.994433 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:40.993974 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls podName:4d21b354-7081-4159-8dfa-77c0c4704b30 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:41.493963949 +0000 UTC m=+34.665664040 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls") pod "image-registry-779c7964f8-9mdv7" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30") : secret "image-registry-tls" not found Apr 17 20:15:40.994433 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.994425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d21b354-7081-4159-8dfa-77c0c4704b30-ca-trust-extracted\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.995993 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.995957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-installation-pull-secrets\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.996117 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.996014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/021577ca-5399-48fc-bacd-f0800327517d-klusterlet-config\") pod \"klusterlet-addon-workmgr-654f86b6b4-nqtf5\" (UID: \"021577ca-5399-48fc-bacd-f0800327517d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:40.996206 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.996186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-hub\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.996530 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.996506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.996751 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.996731 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.996914 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.996868 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-image-registry-private-configuration\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:40.997051 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.997035 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8e718c17-e639-4044-b14e-a421084115db-ca\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:40.998558 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:40.998536 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8e718c17-e639-4044-b14e-a421084115db-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:41.005270 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.005245 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5ltz\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-kube-api-access-h5ltz\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:41.005520 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.005496 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-bound-sa-token\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:41.005775 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.005739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xffcl\" (UniqueName: \"kubernetes.io/projected/8e718c17-e639-4044-b14e-a421084115db-kube-api-access-xffcl\") pod \"cluster-proxy-proxy-agent-64cb777964-fxh4m\" (UID: \"8e718c17-e639-4044-b14e-a421084115db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:41.006187 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.006157 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lgt7f\" (UniqueName: \"kubernetes.io/projected/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-kube-api-access-lgt7f\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:41.007921 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.007902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4nhm\" (UniqueName: \"kubernetes.io/projected/021577ca-5399-48fc-bacd-f0800327517d-kube-api-access-d4nhm\") pod \"klusterlet-addon-workmgr-654f86b6b4-nqtf5\" (UID: \"021577ca-5399-48fc-bacd-f0800327517d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:41.042140 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.042113 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" Apr 17 20:15:41.053021 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.052988 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" Apr 17 20:15:41.077591 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.077226 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:41.094768 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.093970 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jb95n\" (UniqueName: \"kubernetes.io/projected/84435488-0f4e-4e67-98ca-b76e36c0b583-kube-api-access-jb95n\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:41.094768 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.094079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:41.094768 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:41.094231 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:15:41.094768 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:41.094311 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert podName:84435488-0f4e-4e67-98ca-b76e36c0b583 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:41.594285305 +0000 UTC m=+34.765985372 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert") pod "ingress-canary-thggh" (UID: "84435488-0f4e-4e67-98ca-b76e36c0b583") : secret "canary-serving-cert" not found Apr 17 20:15:41.105208 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.105176 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb95n\" (UniqueName: \"kubernetes.io/projected/84435488-0f4e-4e67-98ca-b76e36c0b583-kube-api-access-jb95n\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:41.212605 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.212572 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl"] Apr 17 20:15:41.215341 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.215304 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m"] Apr 17 20:15:41.218117 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:41.218083 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod340f32d4_9729_453c_b1ff_b7f35ba6d9ec.slice/crio-40990e8f887e5336c98c0e5b03cca6618a8b6fe2d4f98f215c5a101d48c8a5ed WatchSource:0}: Error finding container 40990e8f887e5336c98c0e5b03cca6618a8b6fe2d4f98f215c5a101d48c8a5ed: Status 404 returned error can't find the container with id 40990e8f887e5336c98c0e5b03cca6618a8b6fe2d4f98f215c5a101d48c8a5ed Apr 17 20:15:41.219653 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:41.219608 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e718c17_e639_4044_b14e_a421084115db.slice/crio-5acf988a501c79e21b430c60ecfc43a860bef717308e598846f6e10770057da8 WatchSource:0}: Error finding container 5acf988a501c79e21b430c60ecfc43a860bef717308e598846f6e10770057da8: Status 404 returned error can't find the container with id 5acf988a501c79e21b430c60ecfc43a860bef717308e598846f6e10770057da8 Apr 17 20:15:41.240128 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.240099 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5"] Apr 17 20:15:41.243793 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:15:41.243728 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod021577ca_5399_48fc_bacd_f0800327517d.slice/crio-0ce851f5a5200cc03c544b5cbfa3f036675872feba5fd5db1669352209fbd85d WatchSource:0}: Error finding container 0ce851f5a5200cc03c544b5cbfa3f036675872feba5fd5db1669352209fbd85d: Status 404 returned error can't find the container with id 0ce851f5a5200cc03c544b5cbfa3f036675872feba5fd5db1669352209fbd85d Apr 17 20:15:41.485935 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.485868 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" event={"ID":"021577ca-5399-48fc-bacd-f0800327517d","Type":"ContainerStarted","Data":"0ce851f5a5200cc03c544b5cbfa3f036675872feba5fd5db1669352209fbd85d"} Apr 17 20:15:41.487127 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.487085 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" event={"ID":"8e718c17-e639-4044-b14e-a421084115db","Type":"ContainerStarted","Data":"5acf988a501c79e21b430c60ecfc43a860bef717308e598846f6e10770057da8"} Apr 17 20:15:41.488189 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.488158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" event={"ID":"340f32d4-9729-453c-b1ff-b7f35ba6d9ec","Type":"ContainerStarted","Data":"40990e8f887e5336c98c0e5b03cca6618a8b6fe2d4f98f215c5a101d48c8a5ed"} Apr 17 20:15:41.498597 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.498535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:41.498597 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.498577 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:41.498738 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:41.498677 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:15:41.498738 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:41.498693 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c7964f8-9mdv7: secret "image-registry-tls" not found Apr 17 20:15:41.498738 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:41.498714 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:15:41.498826 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:41.498746 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls podName:4d21b354-7081-4159-8dfa-77c0c4704b30 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:42.498729811 +0000 UTC m=+35.670429869 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls") pod "image-registry-779c7964f8-9mdv7" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30") : secret "image-registry-tls" not found Apr 17 20:15:41.498826 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:41.498760 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls podName:0ac23e99-190f-4c9d-80e2-1ba8d936a9f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:42.498752768 +0000 UTC m=+35.670452827 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls") pod "dns-default-qzgfk" (UID: "0ac23e99-190f-4c9d-80e2-1ba8d936a9f1") : secret "dns-default-metrics-tls" not found Apr 17 20:15:41.599438 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:41.599395 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:41.599639 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:41.599578 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:15:41.599682 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:41.599644 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert podName:84435488-0f4e-4e67-98ca-b76e36c0b583 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:42.599623207 +0000 UTC m=+35.771323286 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert") pod "ingress-canary-thggh" (UID: "84435488-0f4e-4e67-98ca-b76e36c0b583") : secret "canary-serving-cert" not found Apr 17 20:15:42.335284 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.335245 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:15:42.335744 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.335726 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:42.336196 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.336174 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:15:42.340066 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.340040 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 20:15:42.342482 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.342457 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:15:42.342728 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.342711 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8drw\"" Apr 17 20:15:42.342967 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.342953 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:15:42.343241 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.343225 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:15:42.343434 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.343419 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rfhz5\"" Apr 17 20:15:42.508470 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.508431 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:42.508962 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.508596 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:42.508962 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:42.508601 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:15:42.508962 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:42.508678 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls podName:0ac23e99-190f-4c9d-80e2-1ba8d936a9f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:44.508656403 +0000 UTC m=+37.680356482 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls") pod "dns-default-qzgfk" (UID: "0ac23e99-190f-4c9d-80e2-1ba8d936a9f1") : secret "dns-default-metrics-tls" not found Apr 17 20:15:42.508962 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:42.508705 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:15:42.508962 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:42.508720 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c7964f8-9mdv7: secret "image-registry-tls" not found Apr 17 20:15:42.508962 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:42.508798 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls podName:4d21b354-7081-4159-8dfa-77c0c4704b30 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:44.508781881 +0000 UTC m=+37.680481957 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls") pod "image-registry-779c7964f8-9mdv7" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30") : secret "image-registry-tls" not found Apr 17 20:15:42.610590 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:42.609819 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:42.610590 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:42.610036 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:15:42.610590 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:42.610104 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert podName:84435488-0f4e-4e67-98ca-b76e36c0b583 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:44.610082165 +0000 UTC m=+37.781782247 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert") pod "ingress-canary-thggh" (UID: "84435488-0f4e-4e67-98ca-b76e36c0b583") : secret "canary-serving-cert" not found Apr 17 20:15:44.530013 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:44.529727 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:44.530496 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:44.530070 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:44.530496 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:44.529943 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:15:44.530496 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:44.530108 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c7964f8-9mdv7: secret "image-registry-tls" not found Apr 17 20:15:44.530496 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:44.530257 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:15:44.530496 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:44.530262 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls podName:4d21b354-7081-4159-8dfa-77c0c4704b30 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:48.530223928 +0000 UTC m=+41.701923999 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls") pod "image-registry-779c7964f8-9mdv7" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30") : secret "image-registry-tls" not found Apr 17 20:15:44.530496 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:44.530311 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls podName:0ac23e99-190f-4c9d-80e2-1ba8d936a9f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:48.530294688 +0000 UTC m=+41.701994760 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls") pod "dns-default-qzgfk" (UID: "0ac23e99-190f-4c9d-80e2-1ba8d936a9f1") : secret "dns-default-metrics-tls" not found Apr 17 20:15:44.631092 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:44.631028 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:44.631292 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:44.631202 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:15:44.631292 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:44.631266 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert podName:84435488-0f4e-4e67-98ca-b76e36c0b583 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:48.631247508 +0000 UTC m=+41.802947567 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert") pod "ingress-canary-thggh" (UID: "84435488-0f4e-4e67-98ca-b76e36c0b583") : secret "canary-serving-cert" not found Apr 17 20:15:46.951022 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:46.950980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:46.954584 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:46.954540 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/30a5e026-7834-4983-8c72-94991eb23377-original-pull-secret\") pod \"global-pull-secret-syncer-92pvt\" (UID: \"30a5e026-7834-4983-8c72-94991eb23377\") " pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:47.167382 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:47.167344 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-92pvt" Apr 17 20:15:48.564848 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:48.564808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:48.565291 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:48.564871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:48.565291 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:48.565009 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:15:48.565291 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:48.565035 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c7964f8-9mdv7: secret "image-registry-tls" not found Apr 17 20:15:48.565291 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:48.565100 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls podName:4d21b354-7081-4159-8dfa-77c0c4704b30 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:56.565082056 +0000 UTC m=+49.736782116 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls") pod "image-registry-779c7964f8-9mdv7" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30") : secret "image-registry-tls" not found Apr 17 20:15:48.565291 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:48.565009 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:15:48.565291 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:48.565139 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls podName:0ac23e99-190f-4c9d-80e2-1ba8d936a9f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:56.565128351 +0000 UTC m=+49.736828423 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls") pod "dns-default-qzgfk" (UID: "0ac23e99-190f-4c9d-80e2-1ba8d936a9f1") : secret "dns-default-metrics-tls" not found Apr 17 20:15:48.665620 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:48.665577 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:48.665807 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:48.665670 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:15:48.665807 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:48.665736 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert podName:84435488-0f4e-4e67-98ca-b76e36c0b583 nodeName:}" failed. No retries permitted until 2026-04-17 20:15:56.665715185 +0000 UTC m=+49.837415284 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert") pod "ingress-canary-thggh" (UID: "84435488-0f4e-4e67-98ca-b76e36c0b583") : secret "canary-serving-cert" not found Apr 17 20:15:49.193766 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:49.193731 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-92pvt"] Apr 17 20:15:49.511209 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:49.511168 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b1dd3fa-14f5-434c-919f-b8ceabccf2b6" containerID="50f7575e38e3a489d15c3e4563b2fe2062a4a30d91e7b55d821a01f2399942d3" exitCode=0 Apr 17 20:15:49.511391 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:49.511253 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ks7qb" event={"ID":"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6","Type":"ContainerDied","Data":"50f7575e38e3a489d15c3e4563b2fe2062a4a30d91e7b55d821a01f2399942d3"} Apr 17 20:15:49.512712 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:49.512686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" event={"ID":"340f32d4-9729-453c-b1ff-b7f35ba6d9ec","Type":"ContainerStarted","Data":"ec4ac4642a9b639013693d559b9487955785781cf717a015e5cecea8b19c4df5"} Apr 17 20:15:49.514047 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:49.514015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" event={"ID":"8e718c17-e639-4044-b14e-a421084115db","Type":"ContainerStarted","Data":"1835fcd82dc089a75bfc46a302445608c2a7278470addc394540fefa0ec23690"} Apr 17 20:15:49.515139 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:49.515076 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-92pvt" event={"ID":"30a5e026-7834-4983-8c72-94991eb23377","Type":"ContainerStarted","Data":"1a76a54f23a5056f6e82f5e3260061830e552d1245cbf433256b5485fb910f73"} Apr 17 20:15:49.516462 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:49.516439 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" event={"ID":"021577ca-5399-48fc-bacd-f0800327517d","Type":"ContainerStarted","Data":"43d76f4ed688e086aec14b56ec19bad448af6f3977b7bc6e90a2df011dd20f16"} Apr 17 20:15:49.516686 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:49.516670 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:49.518397 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:49.518379 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" Apr 17 20:15:49.546266 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:49.546205 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" podStartSLOduration=11.752117184 podStartE2EDuration="19.546184736s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:15:41.221282824 +0000 UTC m=+34.392982887" lastFinishedPulling="2026-04-17 20:15:49.015350361 +0000 UTC m=+42.187050439" observedRunningTime="2026-04-17 20:15:49.545788776 +0000 UTC m=+42.717488887" watchObservedRunningTime="2026-04-17 20:15:49.546184736 +0000 UTC m=+42.717884817" Apr 17 20:15:50.522204 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:50.522158 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b1dd3fa-14f5-434c-919f-b8ceabccf2b6" containerID="efa173a44e0ec4dc4a516bc93d5c9f9cac4691966200ba586bac41958eb10915" exitCode=0 Apr 17 20:15:50.522635 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:50.522245 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ks7qb" event={"ID":"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6","Type":"ContainerDied","Data":"efa173a44e0ec4dc4a516bc93d5c9f9cac4691966200ba586bac41958eb10915"} Apr 17 20:15:50.545845 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:50.545474 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" podStartSLOduration=11.754095981 podStartE2EDuration="19.545455061s" podCreationTimestamp="2026-04-17 20:15:31 +0000 UTC" firstStartedPulling="2026-04-17 20:15:41.245764506 +0000 UTC m=+34.417464566" lastFinishedPulling="2026-04-17 20:15:49.037123574 +0000 UTC m=+42.208823646" observedRunningTime="2026-04-17 20:15:49.563073715 +0000 UTC m=+42.734773822" watchObservedRunningTime="2026-04-17 20:15:50.545455061 +0000 UTC m=+43.717155142" Apr 17 20:15:51.527904 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:51.527853 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ks7qb" event={"ID":"5b1dd3fa-14f5-434c-919f-b8ceabccf2b6","Type":"ContainerStarted","Data":"7180e271459265e5de779a4f09cfc66b7e8aa30d96cdaa49ca49a4161afac41d"} Apr 17 20:15:51.548774 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:51.548713 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ks7qb" podStartSLOduration=5.715407323 podStartE2EDuration="44.548694961s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:15:10.212498517 +0000 UTC m=+3.384198594" lastFinishedPulling="2026-04-17 20:15:49.045785971 +0000 UTC m=+42.217486232" 
observedRunningTime="2026-04-17 20:15:51.548267561 +0000 UTC m=+44.719967642" watchObservedRunningTime="2026-04-17 20:15:51.548694961 +0000 UTC m=+44.720395044" Apr 17 20:15:52.531833 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:52.531789 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" event={"ID":"8e718c17-e639-4044-b14e-a421084115db","Type":"ContainerStarted","Data":"c01b262f08664cd6a6b4082f7a8d5f612c26c58ed8605df5807d1c2ce495b1bf"} Apr 17 20:15:52.531833 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:52.531834 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" event={"ID":"8e718c17-e639-4044-b14e-a421084115db","Type":"ContainerStarted","Data":"3f44981e129568fd10e0a7b18d919465db498bee3126e0140b1b0f2e8c1432cd"} Apr 17 20:15:52.549139 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:52.549072 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" podStartSLOduration=12.160908943999999 podStartE2EDuration="22.549049443s" podCreationTimestamp="2026-04-17 20:15:30 +0000 UTC" firstStartedPulling="2026-04-17 20:15:41.221527835 +0000 UTC m=+34.393227911" lastFinishedPulling="2026-04-17 20:15:51.609668344 +0000 UTC m=+44.781368410" observedRunningTime="2026-04-17 20:15:52.547797584 +0000 UTC m=+45.719497679" watchObservedRunningTime="2026-04-17 20:15:52.549049443 +0000 UTC m=+45.720749544" Apr 17 20:15:54.542703 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:54.542665 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-92pvt" event={"ID":"30a5e026-7834-4983-8c72-94991eb23377","Type":"ContainerStarted","Data":"06bb57ba26ab1525428c96b579f2afe4e3a203901c8fbf7cf2aad1aabe335ad7"} Apr 17 20:15:54.555487 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:54.555430 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-92pvt" podStartSLOduration=18.717332528 podStartE2EDuration="23.555412493s" podCreationTimestamp="2026-04-17 20:15:31 +0000 UTC" firstStartedPulling="2026-04-17 20:15:49.200076246 +0000 UTC m=+42.371776307" lastFinishedPulling="2026-04-17 20:15:54.038156213 +0000 UTC m=+47.209856272" observedRunningTime="2026-04-17 20:15:54.55474124 +0000 UTC m=+47.726441322" watchObservedRunningTime="2026-04-17 20:15:54.555412493 +0000 UTC m=+47.727112577" Apr 17 20:15:56.633155 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:56.633113 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:15:56.633155 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:56.633162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:15:56.633694 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:56.633271 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 17 20:15:56.633694 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:56.633284 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:15:56.633694 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:56.633311 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c7964f8-9mdv7: secret "image-registry-tls" not found Apr 17 20:15:56.633694 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:56.633323 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls podName:0ac23e99-190f-4c9d-80e2-1ba8d936a9f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:12.633309913 +0000 UTC m=+65.805009986 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls") pod "dns-default-qzgfk" (UID: "0ac23e99-190f-4c9d-80e2-1ba8d936a9f1") : secret "dns-default-metrics-tls" not found Apr 17 20:15:56.633694 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:56.633367 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls podName:4d21b354-7081-4159-8dfa-77c0c4704b30 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:12.63334889 +0000 UTC m=+65.805048969 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls") pod "image-registry-779c7964f8-9mdv7" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30") : secret "image-registry-tls" not found Apr 17 20:15:56.733481 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:15:56.733437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:15:56.733621 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:56.733583 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:15:56.733683 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:15:56.733653 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert podName:84435488-0f4e-4e67-98ca-b76e36c0b583 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:12.733636801 +0000 UTC m=+65.905336864 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert") pod "ingress-canary-thggh" (UID: "84435488-0f4e-4e67-98ca-b76e36c0b583") : secret "canary-serving-cert" not found Apr 17 20:16:07.486954 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:07.486922 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmwcg" Apr 17 20:16:12.645627 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:12.645580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:16:12.646061 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:12.645684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:16:12.646061 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:12.645737 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:12.646061 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:12.645774 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:16:12.646061 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:12.645784 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c7964f8-9mdv7: secret "image-registry-tls" not found Apr 17 20:16:12.646061 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:12.645813 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls podName:0ac23e99-190f-4c9d-80e2-1ba8d936a9f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:44.645795151 +0000 UTC m=+97.817495210 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls") pod "dns-default-qzgfk" (UID: "0ac23e99-190f-4c9d-80e2-1ba8d936a9f1") : secret "dns-default-metrics-tls" not found Apr 17 20:16:12.646061 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:12.645827 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls podName:4d21b354-7081-4159-8dfa-77c0c4704b30 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:44.645820781 +0000 UTC m=+97.817520839 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls") pod "image-registry-779c7964f8-9mdv7" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30") : secret "image-registry-tls" not found Apr 17 20:16:12.746872 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:12.746828 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:16:12.747107 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:12.746984 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:12.747107 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:12.747048 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert podName:84435488-0f4e-4e67-98ca-b76e36c0b583 nodeName:}" failed. No retries permitted until 2026-04-17 20:16:44.747031336 +0000 UTC m=+97.918731395 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert") pod "ingress-canary-thggh" (UID: "84435488-0f4e-4e67-98ca-b76e36c0b583") : secret "canary-serving-cert" not found Apr 17 20:16:13.049766 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:13.049732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2w5\" (UniqueName: \"kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5\") pod \"network-check-target-z2fcw\" (UID: \"261e1de3-1829-4520-b7ef-6bb874d9f16e\") " pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:16:13.049960 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:13.049807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:16:13.052015 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:13.051999 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:16:13.052072 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:13.052002 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:16:13.060750 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:13.060721 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:16:13.060850 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:13.060798 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs podName:85885202-a740-4319-827b-236ded2de085 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:17.060776932 +0000 UTC m=+130.232477004 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs") pod "network-metrics-daemon-pwx7f" (UID: "85885202-a740-4319-827b-236ded2de085") : secret "metrics-daemon-secret" not found Apr 17 20:16:13.062407 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:13.062388 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:16:13.073930 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:13.073901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp2w5\" (UniqueName: \"kubernetes.io/projected/261e1de3-1829-4520-b7ef-6bb874d9f16e-kube-api-access-vp2w5\") pod \"network-check-target-z2fcw\" (UID: \"261e1de3-1829-4520-b7ef-6bb874d9f16e\") " pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:16:13.257316 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:13.257287 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8drw\"" Apr 17 20:16:13.265971 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:13.265947 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:16:13.382580 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:13.382537 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z2fcw"] Apr 17 20:16:13.385652 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:16:13.385614 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod261e1de3_1829_4520_b7ef_6bb874d9f16e.slice/crio-ece9b211d9e8232a4c1c9acca310677e8ec7afb47f14af95298cc81807c4a2be WatchSource:0}: Error finding container ece9b211d9e8232a4c1c9acca310677e8ec7afb47f14af95298cc81807c4a2be: Status 404 returned error can't find the container with id ece9b211d9e8232a4c1c9acca310677e8ec7afb47f14af95298cc81807c4a2be Apr 17 20:16:13.591482 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:13.591369 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z2fcw" event={"ID":"261e1de3-1829-4520-b7ef-6bb874d9f16e","Type":"ContainerStarted","Data":"ece9b211d9e8232a4c1c9acca310677e8ec7afb47f14af95298cc81807c4a2be"} Apr 17 20:16:16.600773 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:16.600728 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z2fcw" event={"ID":"261e1de3-1829-4520-b7ef-6bb874d9f16e","Type":"ContainerStarted","Data":"fbc6f053914b769d7720b60b570be63e29f52a3db1e3e970573d0fab2f41dd35"} Apr 17 20:16:16.601168 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:16.600848 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:16:16.615200 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:16.615145 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-z2fcw" podStartSLOduration=66.75608551 podStartE2EDuration="1m9.615129374s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:16:13.387664347 +0000 UTC m=+66.559364421" lastFinishedPulling="2026-04-17 20:16:16.246708226 +0000 UTC m=+69.418408285" 
observedRunningTime="2026-04-17 20:16:16.614088202 +0000 UTC m=+69.785788294" watchObservedRunningTime="2026-04-17 20:16:16.615129374 +0000 UTC m=+69.786829454" Apr 17 20:16:44.702447 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:44.702273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:16:44.702447 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:44.702339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk" Apr 17 20:16:44.702447 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:44.702434 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:16:44.702447 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:44.702451 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:16:44.702447 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:44.702459 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779c7964f8-9mdv7: secret "image-registry-tls" not found Apr 17 20:16:44.703282 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:44.702517 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls podName:0ac23e99-190f-4c9d-80e2-1ba8d936a9f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:48.702499726 +0000 UTC m=+161.874199784 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls") pod "dns-default-qzgfk" (UID: "0ac23e99-190f-4c9d-80e2-1ba8d936a9f1") : secret "dns-default-metrics-tls" not found Apr 17 20:16:44.703282 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:44.702536 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls podName:4d21b354-7081-4159-8dfa-77c0c4704b30 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:48.702527447 +0000 UTC m=+161.874227506 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls") pod "image-registry-779c7964f8-9mdv7" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30") : secret "image-registry-tls" not found Apr 17 20:16:44.803440 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:44.803398 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh" Apr 17 20:16:44.803603 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:44.803551 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:16:44.803652 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:16:44.803624 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert podName:84435488-0f4e-4e67-98ca-b76e36c0b583 nodeName:}" failed. No retries permitted until 2026-04-17 20:17:48.803605352 +0000 UTC m=+161.975305426 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert") pod "ingress-canary-thggh" (UID: "84435488-0f4e-4e67-98ca-b76e36c0b583") : secret "canary-serving-cert" not found Apr 17 20:16:47.606403 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:16:47.606372 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-z2fcw" Apr 17 20:17:11.035745 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:11.035715 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kjf7q_eeee41e0-c56f-4312-97f4-258ec5fc4d4d/dns-node-resolver/0.log" Apr 17 20:17:12.036519 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:12.036488 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n66bj_e2f38744-a4e3-4cc0-94fe-0190e5c2c772/node-ca/0.log" Apr 17 20:17:17.149591 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:17.149540 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f" Apr 17 20:17:17.150109 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:17:17.149571 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:17:17.150109 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:17:17.149710 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs podName:85885202-a740-4319-827b-236ded2de085 nodeName:}" failed. No retries permitted until 2026-04-17 20:19:19.149691617 +0000 UTC m=+252.321391676 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs") pod "network-metrics-daemon-pwx7f" (UID: "85885202-a740-4319-827b-236ded2de085") : secret "metrics-daemon-secret" not found Apr 17 20:17:30.355281 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.355237 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cqfpr"] Apr 17 20:17:30.358435 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.358411 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.360867 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.360837 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 20:17:30.361726 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.361711 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 20:17:30.361726 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.361716 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-59lxf\"" Apr 17 20:17:30.361870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.361714 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 20:17:30.361870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.361714 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 20:17:30.369822 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.369795 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cqfpr"] Apr 17 20:17:30.448570 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.448527 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctl6j\" (UniqueName: \"kubernetes.io/projected/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-kube-api-access-ctl6j\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.448570 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.448567 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-data-volume\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.448796 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.448666 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-crio-socket\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.448796 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.448714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.448796 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.448761 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.550023 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.549986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.550199 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.550040 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctl6j\" (UniqueName: \"kubernetes.io/projected/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-kube-api-access-ctl6j\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.550199 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.550157 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-data-volume\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.550283 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.550224 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-crio-socket\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.550283 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.550262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.550377 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.550358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-crio-socket\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.550516 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.550498 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-data-volume\") pod 
\"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.550689 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.550671 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.555900 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.552832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.560473 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.560450 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctl6j\" (UniqueName: \"kubernetes.io/projected/cd3ca34b-26ea-4674-bf57-67d8b016f2dd-kube-api-access-ctl6j\") pod \"insights-runtime-extractor-cqfpr\" (UID: \"cd3ca34b-26ea-4674-bf57-67d8b016f2dd\") " pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.667543 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.667456 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cqfpr" Apr 17 20:17:30.787827 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:30.787788 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cqfpr"] Apr 17 20:17:30.790757 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:17:30.790730 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd3ca34b_26ea_4674_bf57_67d8b016f2dd.slice/crio-451cfb7f7a39c230a4a66d71cc2f34a5561013c9d5563bdb8232af4c2000c09f WatchSource:0}: Error finding container 451cfb7f7a39c230a4a66d71cc2f34a5561013c9d5563bdb8232af4c2000c09f: Status 404 returned error can't find the container with id 451cfb7f7a39c230a4a66d71cc2f34a5561013c9d5563bdb8232af4c2000c09f Apr 17 20:17:31.772473 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:31.772381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqfpr" event={"ID":"cd3ca34b-26ea-4674-bf57-67d8b016f2dd","Type":"ContainerStarted","Data":"27be154f91859ee3e5778124a22d662fb3ef0c2e6dffa519ef35eafcee9980f7"} Apr 17 20:17:31.772473 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:31.772420 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqfpr" event={"ID":"cd3ca34b-26ea-4674-bf57-67d8b016f2dd","Type":"ContainerStarted","Data":"2790914b52b36a7fc9b152a96ec72bbbef630a29d221d9f6acbcbe9cddfa9ad4"} Apr 17 20:17:31.772473 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:31.772429 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqfpr" event={"ID":"cd3ca34b-26ea-4674-bf57-67d8b016f2dd","Type":"ContainerStarted","Data":"451cfb7f7a39c230a4a66d71cc2f34a5561013c9d5563bdb8232af4c2000c09f"} Apr 17 20:17:33.778839 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:33.778800 2571 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqfpr" event={"ID":"cd3ca34b-26ea-4674-bf57-67d8b016f2dd","Type":"ContainerStarted","Data":"f905fe0b227120fa4e624e57eb839321407d2603abd663076ee5cfa783018ad0"} Apr 17 20:17:33.793988 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:33.793941 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cqfpr" podStartSLOduration=1.6907276530000002 podStartE2EDuration="3.793927201s" podCreationTimestamp="2026-04-17 20:17:30 +0000 UTC" firstStartedPulling="2026-04-17 20:17:30.847107149 +0000 UTC m=+144.018807223" lastFinishedPulling="2026-04-17 20:17:32.950306711 +0000 UTC m=+146.122006771" observedRunningTime="2026-04-17 20:17:33.79316169 +0000 UTC m=+146.964861762" watchObservedRunningTime="2026-04-17 20:17:33.793927201 +0000 UTC m=+146.965627283" Apr 17 20:17:43.768166 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:17:43.768099 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" podUID="4d21b354-7081-4159-8dfa-77c0c4704b30" Apr 17 20:17:43.782297 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:17:43.782263 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qzgfk" podUID="0ac23e99-190f-4c9d-80e2-1ba8d936a9f1" Apr 17 20:17:43.803170 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:43.803131 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" Apr 17 20:17:43.803340 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:43.803141 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qzgfk" Apr 17 20:17:43.849744 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:17:43.849707 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-thggh" podUID="84435488-0f4e-4e67-98ca-b76e36c0b583" Apr 17 20:17:45.376498 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:17:45.376451 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-pwx7f" podUID="85885202-a740-4319-827b-236ded2de085" Apr 17 20:17:46.324690 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.324657 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zmmdh"] Apr 17 20:17:46.330599 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.330567 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.333002 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.332976 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 20:17:46.333153 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.332977 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kf58f\"" Apr 17 20:17:46.333585 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.333566 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 20:17:46.333682 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.333566 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 20:17:46.333682 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.333604 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 20:17:46.333682 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.333623 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 20:17:46.333682 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.333661 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 20:17:46.472935 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.472897 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-wtmp\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.473363 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.472993 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c7dcbd5-6f69-40f5-85e8-13bd474164db-metrics-client-ca\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.473363 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.473048 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0c7dcbd5-6f69-40f5-85e8-13bd474164db-root\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.473363 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.473074 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27xkc\" (UniqueName: \"kubernetes.io/projected/0c7dcbd5-6f69-40f5-85e8-13bd474164db-kube-api-access-27xkc\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.473363 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.473127 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-textfile\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.473363 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.473171 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c7dcbd5-6f69-40f5-85e8-13bd474164db-sys\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.473363 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.473207 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.473363 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.473249 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-accelerators-collector-config\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.473363 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.473283 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-tls\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574010 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.573972 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-wtmp\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574191 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574019 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c7dcbd5-6f69-40f5-85e8-13bd474164db-metrics-client-ca\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574191 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0c7dcbd5-6f69-40f5-85e8-13bd474164db-root\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574191 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574165 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-wtmp\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " 
pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574356 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574194 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27xkc\" (UniqueName: \"kubernetes.io/projected/0c7dcbd5-6f69-40f5-85e8-13bd474164db-kube-api-access-27xkc\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574356 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574217 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0c7dcbd5-6f69-40f5-85e8-13bd474164db-root\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574356 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574243 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-textfile\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574356 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c7dcbd5-6f69-40f5-85e8-13bd474164db-sys\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574356 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574340 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c7dcbd5-6f69-40f5-85e8-13bd474164db-sys\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574589 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574376 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574589 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-accelerators-collector-config\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574589 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-tls\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574589 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574543 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-textfile\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.574845 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.574794 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-accelerators-collector-config\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.575083 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.575062 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c7dcbd5-6f69-40f5-85e8-13bd474164db-metrics-client-ca\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.576688 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.576667 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.576991 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.576971 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0c7dcbd5-6f69-40f5-85e8-13bd474164db-node-exporter-tls\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.601367 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.601325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27xkc\" (UniqueName: \"kubernetes.io/projected/0c7dcbd5-6f69-40f5-85e8-13bd474164db-kube-api-access-27xkc\") pod \"node-exporter-zmmdh\" (UID: \"0c7dcbd5-6f69-40f5-85e8-13bd474164db\") " pod="openshift-monitoring/node-exporter-zmmdh" Apr 17 20:17:46.639788 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.639749 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zmmdh"
Apr 17 20:17:46.648970 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:17:46.648930 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c7dcbd5_6f69_40f5_85e8_13bd474164db.slice/crio-8372398e4440335ebd1e0e6d961bdddea466861303adb05b8945b70f7d052b38 WatchSource:0}: Error finding container 8372398e4440335ebd1e0e6d961bdddea466861303adb05b8945b70f7d052b38: Status 404 returned error can't find the container with id 8372398e4440335ebd1e0e6d961bdddea466861303adb05b8945b70f7d052b38
Apr 17 20:17:46.810856 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:46.810815 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zmmdh" event={"ID":"0c7dcbd5-6f69-40f5-85e8-13bd474164db","Type":"ContainerStarted","Data":"8372398e4440335ebd1e0e6d961bdddea466861303adb05b8945b70f7d052b38"}
Apr 17 20:17:47.814562 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:47.814528 2571 generic.go:358] "Generic (PLEG): container finished" podID="0c7dcbd5-6f69-40f5-85e8-13bd474164db" containerID="fde86e94ca11f0e72e962a210b95af932e1ea333d6e83ee32c854446fbe8e592" exitCode=0
Apr 17 20:17:47.814965 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:47.814575 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zmmdh" event={"ID":"0c7dcbd5-6f69-40f5-85e8-13bd474164db","Type":"ContainerDied","Data":"fde86e94ca11f0e72e962a210b95af932e1ea333d6e83ee32c854446fbe8e592"}
Apr 17 20:17:48.794629 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.794584 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk"
Apr 17 20:17:48.794917 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.794662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7"
Apr 17 20:17:48.797166 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.797137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ac23e99-190f-4c9d-80e2-1ba8d936a9f1-metrics-tls\") pod \"dns-default-qzgfk\" (UID: \"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1\") " pod="openshift-dns/dns-default-qzgfk"
Apr 17 20:17:48.797289 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.797272 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"image-registry-779c7964f8-9mdv7\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") " pod="openshift-image-registry/image-registry-779c7964f8-9mdv7"
Apr 17 20:17:48.818842 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.818804 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zmmdh" event={"ID":"0c7dcbd5-6f69-40f5-85e8-13bd474164db","Type":"ContainerStarted","Data":"af09dfb0e97ec6cd431f107a2a536cefd55cfb2b371bb977cd58499be9ea83c6"}
Apr 17 20:17:48.819220 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.818850 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zmmdh" event={"ID":"0c7dcbd5-6f69-40f5-85e8-13bd474164db","Type":"ContainerStarted","Data":"a50dac82cb4b8351625477c0362da93e303b9368ede2340fdc46a89f5831c055"}
Apr 17 20:17:48.838458 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.838404 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zmmdh" podStartSLOduration=1.902061633 podStartE2EDuration="2.838387228s" podCreationTimestamp="2026-04-17 20:17:46 +0000 UTC" firstStartedPulling="2026-04-17 20:17:46.65117466 +0000 UTC m=+159.822874722" lastFinishedPulling="2026-04-17 20:17:47.587500255 +0000 UTC m=+160.759200317" observedRunningTime="2026-04-17 20:17:48.837202929 +0000 UTC m=+162.008903033" watchObservedRunningTime="2026-04-17 20:17:48.838387228 +0000 UTC m=+162.010087309"
Apr 17 20:17:48.896011 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.895975 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh"
Apr 17 20:17:48.898528 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.898497 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84435488-0f4e-4e67-98ca-b76e36c0b583-cert\") pod \"ingress-canary-thggh\" (UID: \"84435488-0f4e-4e67-98ca-b76e36c0b583\") " pod="openshift-ingress-canary/ingress-canary-thggh"
Apr 17 20:17:48.906453 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.906428 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x9tzj\""
Apr 17 20:17:48.906584 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.906428 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tkw2s\""
Apr 17 20:17:48.915180 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.915153 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qzgfk"
Apr 17 20:17:48.915274 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:48.915235 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7"
Apr 17 20:17:49.063528 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.063446 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-779c7964f8-9mdv7"]
Apr 17 20:17:49.066041 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:17:49.066002 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d21b354_7081_4159_8dfa_77c0c4704b30.slice/crio-e53ae1afbdf895f6439041d01f111f531445c2d35083be4a19f3594cb184cfb5 WatchSource:0}: Error finding container e53ae1afbdf895f6439041d01f111f531445c2d35083be4a19f3594cb184cfb5: Status 404 returned error can't find the container with id e53ae1afbdf895f6439041d01f111f531445c2d35083be4a19f3594cb184cfb5
Apr 17 20:17:49.077420 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.077398 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qzgfk"]
Apr 17 20:17:49.079824 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:17:49.079799 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ac23e99_190f_4c9d_80e2_1ba8d936a9f1.slice/crio-9a11b06c18654abb2f4173bef835a9cbba7c7117899ce158e714e3446f21a428 WatchSource:0}: Error finding container 9a11b06c18654abb2f4173bef835a9cbba7c7117899ce158e714e3446f21a428: Status 404 returned error can't find the container with id 9a11b06c18654abb2f4173bef835a9cbba7c7117899ce158e714e3446f21a428
Apr 17 20:17:49.517829 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.517724 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" podUID="021577ca-5399-48fc-bacd-f0800327517d" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused"
Apr 17 20:17:49.823914 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.823808 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" event={"ID":"4d21b354-7081-4159-8dfa-77c0c4704b30","Type":"ContainerStarted","Data":"225aaf7957df714fe517be686d11c965bde020a2ba83eceb06abbb7f62c5ea1c"}
Apr 17 20:17:49.823914 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.823852 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" event={"ID":"4d21b354-7081-4159-8dfa-77c0c4704b30","Type":"ContainerStarted","Data":"e53ae1afbdf895f6439041d01f111f531445c2d35083be4a19f3594cb184cfb5"}
Apr 17 20:17:49.824409 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.823942 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7"
Apr 17 20:17:49.825664 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.825637 2571 generic.go:358] "Generic (PLEG): container finished" podID="340f32d4-9729-453c-b1ff-b7f35ba6d9ec" containerID="ec4ac4642a9b639013693d559b9487955785781cf717a015e5cecea8b19c4df5" exitCode=255
Apr 17 20:17:49.825801 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.825718 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" event={"ID":"340f32d4-9729-453c-b1ff-b7f35ba6d9ec","Type":"ContainerDied","Data":"ec4ac4642a9b639013693d559b9487955785781cf717a015e5cecea8b19c4df5"}
Apr 17 20:17:49.826071 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.826049 2571 scope.go:117] "RemoveContainer" containerID="ec4ac4642a9b639013693d559b9487955785781cf717a015e5cecea8b19c4df5"
Apr 17 20:17:49.827337 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.827314 2571 generic.go:358] "Generic (PLEG): container finished" podID="021577ca-5399-48fc-bacd-f0800327517d" containerID="43d76f4ed688e086aec14b56ec19bad448af6f3977b7bc6e90a2df011dd20f16" exitCode=1
Apr 17 20:17:49.827433 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.827374 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" event={"ID":"021577ca-5399-48fc-bacd-f0800327517d","Type":"ContainerDied","Data":"43d76f4ed688e086aec14b56ec19bad448af6f3977b7bc6e90a2df011dd20f16"}
Apr 17 20:17:49.827728 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.827653 2571 scope.go:117] "RemoveContainer" containerID="43d76f4ed688e086aec14b56ec19bad448af6f3977b7bc6e90a2df011dd20f16"
Apr 17 20:17:49.828784 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.828765 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qzgfk" event={"ID":"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1","Type":"ContainerStarted","Data":"9a11b06c18654abb2f4173bef835a9cbba7c7117899ce158e714e3446f21a428"}
Apr 17 20:17:49.843502 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:49.843451 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" podStartSLOduration=161.843432971 podStartE2EDuration="2m41.843432971s" podCreationTimestamp="2026-04-17 20:15:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:17:49.841157834 +0000 UTC m=+163.012857919" watchObservedRunningTime="2026-04-17 20:17:49.843432971 +0000 UTC m=+163.015133053"
Apr 17 20:17:50.833411 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:50.833379 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6566bd4dbf-5n2gl" event={"ID":"340f32d4-9729-453c-b1ff-b7f35ba6d9ec","Type":"ContainerStarted","Data":"4ca6eae5e7efbe21e72dd1c365709139173436bd0eaac69bac001b1d3c74487e"}
Apr 17 20:17:50.835124 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:50.835098 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5" event={"ID":"021577ca-5399-48fc-bacd-f0800327517d","Type":"ContainerStarted","Data":"7876b96601c32b009e36857ac72b52ee1998b05ef08fd7ddf5670217ee54ed4a"}
Apr 17 20:17:50.835393 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:50.835372 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5"
Apr 17 20:17:50.836072 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:50.836050 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-654f86b6b4-nqtf5"
Apr 17 20:17:50.836543 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:50.836504 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qzgfk" event={"ID":"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1","Type":"ContainerStarted","Data":"b99a1f66a4917266e1aeabd40c89974afd86c0b1e690b2f0ca24f824c5e593f3"}
Apr 17 20:17:51.841236 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:51.841183 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qzgfk" event={"ID":"0ac23e99-190f-4c9d-80e2-1ba8d936a9f1","Type":"ContainerStarted","Data":"215d1e6b9ea5cbe276761a630d39f5cdace5964e68ad729152897883da9a2d89"}
Apr 17 20:17:51.841585 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:51.841407 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qzgfk"
Apr 17 20:17:51.859763 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:51.859710 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qzgfk" podStartSLOduration=130.317128904 podStartE2EDuration="2m11.859686795s" podCreationTimestamp="2026-04-17 20:15:40 +0000 UTC" firstStartedPulling="2026-04-17 20:17:49.081620401 +0000 UTC m=+162.253320460" lastFinishedPulling="2026-04-17 20:17:50.624178281 +0000 UTC m=+163.795878351" observedRunningTime="2026-04-17 20:17:51.85819468 +0000 UTC m=+165.029894760" watchObservedRunningTime="2026-04-17 20:17:51.859686795 +0000 UTC m=+165.031386875"
Apr 17 20:17:55.335144 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:55.335058 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-thggh"
Apr 17 20:17:55.337324 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:55.337292 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ndtv9\""
Apr 17 20:17:55.346006 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:55.345982 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-thggh"
Apr 17 20:17:55.462931 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:55.462897 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-thggh"]
Apr 17 20:17:55.466306 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:17:55.466280 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84435488_0f4e_4e67_98ca_b76e36c0b583.slice/crio-8129b36d4ebe330a253f2562a07f9f9888f6e8fb90bb073c995745bfd5170a6f WatchSource:0}: Error finding container 8129b36d4ebe330a253f2562a07f9f9888f6e8fb90bb073c995745bfd5170a6f: Status 404 returned error can't find the container with id 8129b36d4ebe330a253f2562a07f9f9888f6e8fb90bb073c995745bfd5170a6f
Apr 17 20:17:55.854978 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:55.854941 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-thggh" event={"ID":"84435488-0f4e-4e67-98ca-b76e36c0b583","Type":"ContainerStarted","Data":"8129b36d4ebe330a253f2562a07f9f9888f6e8fb90bb073c995745bfd5170a6f"}
Apr 17 20:17:57.861830 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:57.861793 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-thggh" event={"ID":"84435488-0f4e-4e67-98ca-b76e36c0b583","Type":"ContainerStarted","Data":"898d5011fbd30ea5f5407f9da9fca5f1a2d242acda586ac6986230839c539789"}
Apr 17 20:17:57.875589 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:57.875532 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-thggh" podStartSLOduration=135.874745603 podStartE2EDuration="2m17.875515722s" podCreationTimestamp="2026-04-17 20:15:40 +0000 UTC" firstStartedPulling="2026-04-17 20:17:55.468685564 +0000 UTC m=+168.640385622" lastFinishedPulling="2026-04-17 20:17:57.469455663 +0000 UTC m=+170.641155741" observedRunningTime="2026-04-17 20:17:57.874723106 +0000 UTC m=+171.046423178" watchObservedRunningTime="2026-04-17 20:17:57.875515722 +0000 UTC m=+171.047215836"
Apr 17 20:17:59.335110 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:17:59.335071 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f"
Apr 17 20:18:01.848065 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:01.848032 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qzgfk"
Apr 17 20:18:08.919451 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:08.919416 2571 patch_prober.go:28] interesting pod/image-registry-779c7964f8-9mdv7 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 20:18:08.919855 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:08.919475 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" podUID="4d21b354-7081-4159-8dfa-77c0c4704b30" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 20:18:10.840924 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:10.840894 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7"
Apr 17 20:18:20.750533 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:20.750499 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-779c7964f8-9mdv7"]
Apr 17 20:18:31.055062 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:31.055010 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" podUID="8e718c17-e639-4044-b14e-a421084115db" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 20:18:39.217814 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:39.217780 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-779c7964f8-9mdv7_4d21b354-7081-4159-8dfa-77c0c4704b30/registry/0.log"
Apr 17 20:18:39.227672 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:39.227643 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n66bj_e2f38744-a4e3-4cc0-94fe-0190e5c2c772/node-ca/0.log"
Apr 17 20:18:41.054257 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:41.054216 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" podUID="8e718c17-e639-4044-b14e-a421084115db" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 20:18:45.769994 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:45.769936 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" podUID="4d21b354-7081-4159-8dfa-77c0c4704b30" containerName="registry" containerID="cri-o://225aaf7957df714fe517be686d11c965bde020a2ba83eceb06abbb7f62c5ea1c" gracePeriod=30
Apr 17 20:18:45.996479 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:45.996448 2571 generic.go:358] "Generic (PLEG): container finished" podID="4d21b354-7081-4159-8dfa-77c0c4704b30" containerID="225aaf7957df714fe517be686d11c965bde020a2ba83eceb06abbb7f62c5ea1c" exitCode=0
Apr 17 20:18:45.996556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:45.996499 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" event={"ID":"4d21b354-7081-4159-8dfa-77c0c4704b30","Type":"ContainerDied","Data":"225aaf7957df714fe517be686d11c965bde020a2ba83eceb06abbb7f62c5ea1c"}
Apr 17 20:18:45.996556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:45.996531 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7" event={"ID":"4d21b354-7081-4159-8dfa-77c0c4704b30","Type":"ContainerDied","Data":"e53ae1afbdf895f6439041d01f111f531445c2d35083be4a19f3594cb184cfb5"}
Apr 17 20:18:45.996556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:45.996545 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e53ae1afbdf895f6439041d01f111f531445c2d35083be4a19f3594cb184cfb5"
Apr 17 20:18:46.005914 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.005870 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7"
Apr 17 20:18:46.135184 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.135141 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") pod \"4d21b354-7081-4159-8dfa-77c0c4704b30\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") "
Apr 17 20:18:46.135184 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.135185 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-installation-pull-secrets\") pod \"4d21b354-7081-4159-8dfa-77c0c4704b30\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") "
Apr 17 20:18:46.135425 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.135202 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-trusted-ca\") pod \"4d21b354-7081-4159-8dfa-77c0c4704b30\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") "
Apr 17 20:18:46.135425 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.135222 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-image-registry-private-configuration\") pod \"4d21b354-7081-4159-8dfa-77c0c4704b30\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") "
Apr 17 20:18:46.135425 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.135245 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-bound-sa-token\") pod \"4d21b354-7081-4159-8dfa-77c0c4704b30\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") "
Apr 17 20:18:46.135425 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.135278 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d21b354-7081-4159-8dfa-77c0c4704b30-ca-trust-extracted\") pod \"4d21b354-7081-4159-8dfa-77c0c4704b30\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") "
Apr 17 20:18:46.135425 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.135326 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-certificates\") pod \"4d21b354-7081-4159-8dfa-77c0c4704b30\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") "
Apr 17 20:18:46.135425 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.135377 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5ltz\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-kube-api-access-h5ltz\") pod \"4d21b354-7081-4159-8dfa-77c0c4704b30\" (UID: \"4d21b354-7081-4159-8dfa-77c0c4704b30\") "
Apr 17 20:18:46.135909 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.135787 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4d21b354-7081-4159-8dfa-77c0c4704b30" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:18:46.135909 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.135855 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4d21b354-7081-4159-8dfa-77c0c4704b30" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:18:46.138050 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.138018 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4d21b354-7081-4159-8dfa-77c0c4704b30" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:18:46.138175 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.138121 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4d21b354-7081-4159-8dfa-77c0c4704b30" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:18:46.138175 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.138120 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4d21b354-7081-4159-8dfa-77c0c4704b30" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:18:46.138266 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.138186 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4d21b354-7081-4159-8dfa-77c0c4704b30" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:18:46.138421 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.138396 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-kube-api-access-h5ltz" (OuterVolumeSpecName: "kube-api-access-h5ltz") pod "4d21b354-7081-4159-8dfa-77c0c4704b30" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30"). InnerVolumeSpecName "kube-api-access-h5ltz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:18:46.144312 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.144288 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d21b354-7081-4159-8dfa-77c0c4704b30-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4d21b354-7081-4159-8dfa-77c0c4704b30" (UID: "4d21b354-7081-4159-8dfa-77c0c4704b30"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:18:46.236265 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.236212 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h5ltz\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-kube-api-access-h5ltz\") on node \"ip-10-0-129-50.ec2.internal\" DevicePath \"\""
Apr 17 20:18:46.236265 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.236259 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-tls\") on node \"ip-10-0-129-50.ec2.internal\" DevicePath \"\""
Apr 17 20:18:46.236265 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.236269 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-installation-pull-secrets\") on node \"ip-10-0-129-50.ec2.internal\" DevicePath \"\""
Apr 17 20:18:46.236265 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.236278 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-trusted-ca\") on node \"ip-10-0-129-50.ec2.internal\" DevicePath \"\""
Apr 17 20:18:46.236515 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.236288 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4d21b354-7081-4159-8dfa-77c0c4704b30-image-registry-private-configuration\") on node \"ip-10-0-129-50.ec2.internal\" DevicePath \"\""
Apr 17 20:18:46.236515 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.236298 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d21b354-7081-4159-8dfa-77c0c4704b30-bound-sa-token\") on node \"ip-10-0-129-50.ec2.internal\" DevicePath \"\""
Apr 17 20:18:46.236515 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.236307 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d21b354-7081-4159-8dfa-77c0c4704b30-ca-trust-extracted\") on node \"ip-10-0-129-50.ec2.internal\" DevicePath \"\""
Apr 17 20:18:46.236515 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.236315 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d21b354-7081-4159-8dfa-77c0c4704b30-registry-certificates\") on node \"ip-10-0-129-50.ec2.internal\" DevicePath \"\""
Apr 17 20:18:46.999542 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:46.999509 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779c7964f8-9mdv7"
Apr 17 20:18:47.018844 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:47.018816 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-779c7964f8-9mdv7"]
Apr 17 20:18:47.024485 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:47.024455 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-779c7964f8-9mdv7"]
Apr 17 20:18:47.340632 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:47.340599 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d21b354-7081-4159-8dfa-77c0c4704b30" path="/var/lib/kubelet/pods/4d21b354-7081-4159-8dfa-77c0c4704b30/volumes"
Apr 17 20:18:51.054111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:51.054073 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" podUID="8e718c17-e639-4044-b14e-a421084115db" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 20:18:51.054605 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:51.054151 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m"
Apr 17 20:18:51.054778 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:51.054744 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"c01b262f08664cd6a6b4082f7a8d5f612c26c58ed8605df5807d1c2ce495b1bf"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 17 20:18:51.054852 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:51.054806 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" podUID="8e718c17-e639-4044-b14e-a421084115db" containerName="service-proxy" containerID="cri-o://c01b262f08664cd6a6b4082f7a8d5f612c26c58ed8605df5807d1c2ce495b1bf" gracePeriod=30
Apr 17 20:18:52.015283 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:52.015247 2571 generic.go:358] "Generic (PLEG): container finished" podID="8e718c17-e639-4044-b14e-a421084115db" containerID="c01b262f08664cd6a6b4082f7a8d5f612c26c58ed8605df5807d1c2ce495b1bf" exitCode=2
Apr 17 20:18:52.015531 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:52.015313 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" event={"ID":"8e718c17-e639-4044-b14e-a421084115db","Type":"ContainerDied","Data":"c01b262f08664cd6a6b4082f7a8d5f612c26c58ed8605df5807d1c2ce495b1bf"}
Apr 17 20:18:52.015531 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:18:52.015351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64cb777964-fxh4m" event={"ID":"8e718c17-e639-4044-b14e-a421084115db","Type":"ContainerStarted","Data":"d1d3f7e1220dd53e04ffa42c12134720b44ad3aacbbf20f14cecb1fd2171ebc6"}
Apr 17 20:19:19.190494 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:19:19.189404 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f"
Apr 17 20:19:19.192196 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:19:19.192166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85885202-a740-4319-827b-236ded2de085-metrics-certs\") pod \"network-metrics-daemon-pwx7f\" (UID: \"85885202-a740-4319-827b-236ded2de085\") " pod="openshift-multus/network-metrics-daemon-pwx7f"
Apr 17 20:19:19.438017 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:19:19.437977 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rfhz5\""
Apr 17 20:19:19.446970 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:19:19.446894 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pwx7f"
Apr 17 20:19:19.567620 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:19:19.567563 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pwx7f"]
Apr 17 20:19:19.571334 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:19:19.571307 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85885202_a740_4319_827b_236ded2de085.slice/crio-3c96a29be581e61676298e03507005b2ae7c7813b389bf6ce5304f4d314e8bf8 WatchSource:0}: Error finding container 3c96a29be581e61676298e03507005b2ae7c7813b389bf6ce5304f4d314e8bf8: Status 404 returned error can't find the container with id 3c96a29be581e61676298e03507005b2ae7c7813b389bf6ce5304f4d314e8bf8
Apr 17 20:19:20.096499 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:19:20.096463 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pwx7f" event={"ID":"85885202-a740-4319-827b-236ded2de085","Type":"ContainerStarted","Data":"3c96a29be581e61676298e03507005b2ae7c7813b389bf6ce5304f4d314e8bf8"}
Apr 17 20:19:21.101016 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:19:21.100965 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pwx7f" event={"ID":"85885202-a740-4319-827b-236ded2de085","Type":"ContainerStarted","Data":"d598ee39f02b2ee750831815f9d059de61400789458f1d5629f3babc28e2766b"}
Apr 17 20:19:21.101016 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:19:21.101014 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pwx7f" event={"ID":"85885202-a740-4319-827b-236ded2de085","Type":"ContainerStarted","Data":"a5a844754fecd4364dde26469783ddba2dfe93880de9a909e92de1c4b2325547"}
Apr 17 20:19:21.118039 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:19:21.117983 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pwx7f" podStartSLOduration=252.991599351 podStartE2EDuration="4m14.117964559s" podCreationTimestamp="2026-04-17 20:15:07 +0000 UTC" firstStartedPulling="2026-04-17 20:19:19.573047726 +0000 UTC m=+252.744747798" lastFinishedPulling="2026-04-17 20:19:20.699412937 +0000 UTC m=+253.871113006" observedRunningTime="2026-04-17 20:19:21.117084285 +0000 UTC m=+254.288784366" watchObservedRunningTime="2026-04-17 20:19:21.117964559 +0000 UTC m=+254.289664641"
Apr 17 20:20:07.249462 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:07.249434 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 20:20:32.743719 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.743686 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"]
Apr 17 20:20:32.746075 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.743937 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d21b354-7081-4159-8dfa-77c0c4704b30" containerName="registry"
Apr 17 20:20:32.746075 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.743950 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d21b354-7081-4159-8dfa-77c0c4704b30" containerName="registry"
Apr 17 20:20:32.746075 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.743996 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d21b354-7081-4159-8dfa-77c0c4704b30" containerName="registry"
Apr 17 20:20:32.746945 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.746927 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"
Apr 17 20:20:32.749065 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.749037 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 20:20:32.749193 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.749041 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 20:20:32.749671 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.749657 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-wp92c\""
Apr 17 20:20:32.755019 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.754995 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"]
Apr 17 20:20:32.812582 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.812547 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f27lc\" (UniqueName: \"kubernetes.io/projected/32687fd1-fed7-43ff-a8c9-4caff6d724ff-kube-api-access-f27lc\") pod \"cert-manager-cainjector-8966b78d4-b4c9k\" (UID: \"32687fd1-fed7-43ff-a8c9-4caff6d724ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"
Apr 17 20:20:32.812582 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.812595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32687fd1-fed7-43ff-a8c9-4caff6d724ff-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-b4c9k\" (UID: \"32687fd1-fed7-43ff-a8c9-4caff6d724ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"
Apr 17 20:20:32.913852 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.913809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f27lc\" (UniqueName: \"kubernetes.io/projected/32687fd1-fed7-43ff-a8c9-4caff6d724ff-kube-api-access-f27lc\") pod \"cert-manager-cainjector-8966b78d4-b4c9k\" (UID: \"32687fd1-fed7-43ff-a8c9-4caff6d724ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"
Apr 17 20:20:32.913852 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.913861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32687fd1-fed7-43ff-a8c9-4caff6d724ff-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-b4c9k\" (UID: \"32687fd1-fed7-43ff-a8c9-4caff6d724ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"
Apr 17 20:20:32.921791 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.921767 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32687fd1-fed7-43ff-a8c9-4caff6d724ff-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-b4c9k\" (UID: \"32687fd1-fed7-43ff-a8c9-4caff6d724ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"
Apr 17 20:20:32.921959 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:32.921938 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f27lc\" (UniqueName: \"kubernetes.io/projected/32687fd1-fed7-43ff-a8c9-4caff6d724ff-kube-api-access-f27lc\") pod \"cert-manager-cainjector-8966b78d4-b4c9k\" (UID: \"32687fd1-fed7-43ff-a8c9-4caff6d724ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"
Apr 17 20:20:33.056312 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:33.056268 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"
Apr 17 20:20:33.174930 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:33.174873 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-b4c9k"]
Apr 17 20:20:33.178212 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:20:33.178178 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32687fd1_fed7_43ff_a8c9_4caff6d724ff.slice/crio-904dd67314cc78e6c346bf565ef2c6b87eb6470cdd55dc55f0f3b19fea1d2a01 WatchSource:0}: Error finding container 904dd67314cc78e6c346bf565ef2c6b87eb6470cdd55dc55f0f3b19fea1d2a01: Status 404 returned error can't find the container with id 904dd67314cc78e6c346bf565ef2c6b87eb6470cdd55dc55f0f3b19fea1d2a01
Apr 17 20:20:33.180093 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:33.180075 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 20:20:33.285005 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:33.284970 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k" event={"ID":"32687fd1-fed7-43ff-a8c9-4caff6d724ff","Type":"ContainerStarted","Data":"904dd67314cc78e6c346bf565ef2c6b87eb6470cdd55dc55f0f3b19fea1d2a01"}
Apr 17 20:20:38.300002 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:38.299959 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k" event={"ID":"32687fd1-fed7-43ff-a8c9-4caff6d724ff","Type":"ContainerStarted","Data":"14c81eaaf9ee294dfecd36b5c5422269f5aa0089540dd2269e5ef12b2aa79e42"}
Apr 17 20:20:38.314689 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:38.314640 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-b4c9k" podStartSLOduration=1.768953226 podStartE2EDuration="6.314625355s" podCreationTimestamp="2026-04-17 20:20:32 +0000 UTC" firstStartedPulling="2026-04-17 20:20:33.180259014 +0000 UTC m=+326.351959080" lastFinishedPulling="2026-04-17 20:20:37.725931146 +0000 UTC m=+330.897631209" observedRunningTime="2026-04-17 20:20:38.312767326 +0000 UTC m=+331.484467407" watchObservedRunningTime="2026-04-17 20:20:38.314625355 +0000 UTC m=+331.486325436"
Apr 17 20:20:53.303079 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.303039 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"]
Apr 17 20:20:53.306130 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.306113 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"
Apr 17 20:20:53.308262 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.308237 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 20:20:53.308979 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.308963 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:20:53.309055 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.308985 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-gg6q9\""
Apr 17 20:20:53.313021 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.312996 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"]
Apr 17 20:20:53.474309 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.474268 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3391230-98fc-4d37-8cc0-4986d17eb994-tmp\") pod \"openshift-lws-operator-bfc7f696d-9q62g\" (UID: \"a3391230-98fc-4d37-8cc0-4986d17eb994\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"
Apr 17 20:20:53.474490 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.474324 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfkvf\" (UniqueName: \"kubernetes.io/projected/a3391230-98fc-4d37-8cc0-4986d17eb994-kube-api-access-xfkvf\") pod \"openshift-lws-operator-bfc7f696d-9q62g\" (UID: \"a3391230-98fc-4d37-8cc0-4986d17eb994\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"
Apr 17 20:20:53.574940 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.574812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfkvf\" (UniqueName: \"kubernetes.io/projected/a3391230-98fc-4d37-8cc0-4986d17eb994-kube-api-access-xfkvf\") pod \"openshift-lws-operator-bfc7f696d-9q62g\" (UID: \"a3391230-98fc-4d37-8cc0-4986d17eb994\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"
Apr 17 20:20:53.574940 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.574901 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3391230-98fc-4d37-8cc0-4986d17eb994-tmp\") pod \"openshift-lws-operator-bfc7f696d-9q62g\" (UID: \"a3391230-98fc-4d37-8cc0-4986d17eb994\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"
Apr 17 20:20:53.575237 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.575218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3391230-98fc-4d37-8cc0-4986d17eb994-tmp\") pod \"openshift-lws-operator-bfc7f696d-9q62g\" (UID: \"a3391230-98fc-4d37-8cc0-4986d17eb994\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"
Apr 17 20:20:53.582298 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.582273 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfkvf\" (UniqueName: \"kubernetes.io/projected/a3391230-98fc-4d37-8cc0-4986d17eb994-kube-api-access-xfkvf\") pod \"openshift-lws-operator-bfc7f696d-9q62g\" (UID: \"a3391230-98fc-4d37-8cc0-4986d17eb994\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"
Apr 17 20:20:53.615289 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.615213 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"
Apr 17 20:20:53.732030 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:53.731894 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g"]
Apr 17 20:20:53.734481 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:20:53.734446 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3391230_98fc_4d37_8cc0_4986d17eb994.slice/crio-96a3ed23f11b0c1fcb24200d6f9a483a32873624fcbbd1488df67172d6593f71 WatchSource:0}: Error finding container 96a3ed23f11b0c1fcb24200d6f9a483a32873624fcbbd1488df67172d6593f71: Status 404 returned error can't find the container with id 96a3ed23f11b0c1fcb24200d6f9a483a32873624fcbbd1488df67172d6593f71
Apr 17 20:20:54.342327 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:54.342289 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g" event={"ID":"a3391230-98fc-4d37-8cc0-4986d17eb994","Type":"ContainerStarted","Data":"96a3ed23f11b0c1fcb24200d6f9a483a32873624fcbbd1488df67172d6593f71"}
Apr 17 20:20:57.352412 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:57.352376 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g" event={"ID":"a3391230-98fc-4d37-8cc0-4986d17eb994","Type":"ContainerStarted","Data":"48d5d902fb867ff3fc40182f2a0a9246b8985e370ecfbe8a0de96c57b7fa39ce"}
Apr 17 20:20:57.366618 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:20:57.366571 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-9q62g" podStartSLOduration=0.968904337 podStartE2EDuration="4.366555345s" podCreationTimestamp="2026-04-17 20:20:53 +0000 UTC" firstStartedPulling="2026-04-17 20:20:53.735991561 +0000 UTC m=+346.907691621" lastFinishedPulling="2026-04-17 20:20:57.133642377 +0000 UTC m=+350.305342629" observedRunningTime="2026-04-17 20:20:57.366006776 +0000 UTC m=+350.537706859" watchObservedRunningTime="2026-04-17 20:20:57.366555345 +0000 UTC m=+350.538255426"
Apr 17 20:21:16.509819 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.509778 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"]
Apr 17 20:21:16.518449 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.518417 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.521145 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.521122 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 20:21:16.521145 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.521119 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 20:21:16.521337 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.521150 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 20:21:16.521337 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.521225 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 20:21:16.521337 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.521240 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-5wc67\""
Apr 17 20:21:16.526095 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.526071 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"]
Apr 17 20:21:16.537610 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.537587 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/038276ad-1ad9-48e1-9463-f2805ad83aba-webhook-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-5x2bk\" (UID: \"038276ad-1ad9-48e1-9463-f2805ad83aba\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.537734 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.537626 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrzkg\" (UniqueName: \"kubernetes.io/projected/038276ad-1ad9-48e1-9463-f2805ad83aba-kube-api-access-hrzkg\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-5x2bk\" (UID: \"038276ad-1ad9-48e1-9463-f2805ad83aba\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.537734 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.537650 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/038276ad-1ad9-48e1-9463-f2805ad83aba-apiservice-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-5x2bk\" (UID: \"038276ad-1ad9-48e1-9463-f2805ad83aba\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.638765 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.638717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/038276ad-1ad9-48e1-9463-f2805ad83aba-webhook-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-5x2bk\" (UID: \"038276ad-1ad9-48e1-9463-f2805ad83aba\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.638765 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.638773 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrzkg\" (UniqueName: \"kubernetes.io/projected/038276ad-1ad9-48e1-9463-f2805ad83aba-kube-api-access-hrzkg\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-5x2bk\" (UID: \"038276ad-1ad9-48e1-9463-f2805ad83aba\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.639037 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.638797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/038276ad-1ad9-48e1-9463-f2805ad83aba-apiservice-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-5x2bk\" (UID: \"038276ad-1ad9-48e1-9463-f2805ad83aba\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.641418 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.641393 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/038276ad-1ad9-48e1-9463-f2805ad83aba-webhook-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-5x2bk\" (UID: \"038276ad-1ad9-48e1-9463-f2805ad83aba\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.641538 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.641460 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/038276ad-1ad9-48e1-9463-f2805ad83aba-apiservice-cert\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-5x2bk\" (UID: \"038276ad-1ad9-48e1-9463-f2805ad83aba\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.647084 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.647059 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrzkg\" (UniqueName: \"kubernetes.io/projected/038276ad-1ad9-48e1-9463-f2805ad83aba-kube-api-access-hrzkg\") pod \"opendatahub-operator-controller-manager-799c8bc7d9-5x2bk\" (UID: \"038276ad-1ad9-48e1-9463-f2805ad83aba\") " pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.829248 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.829205 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:16.950372 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:16.950182 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"]
Apr 17 20:21:16.953940 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:21:16.953911 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod038276ad_1ad9_48e1_9463_f2805ad83aba.slice/crio-fd0cff88b31f8e5ab97805e6261025bf1b17d3fdc485260977375dc573d1fd81 WatchSource:0}: Error finding container fd0cff88b31f8e5ab97805e6261025bf1b17d3fdc485260977375dc573d1fd81: Status 404 returned error can't find the container with id fd0cff88b31f8e5ab97805e6261025bf1b17d3fdc485260977375dc573d1fd81
Apr 17 20:21:17.407487 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:17.407435 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk" event={"ID":"038276ad-1ad9-48e1-9463-f2805ad83aba","Type":"ContainerStarted","Data":"fd0cff88b31f8e5ab97805e6261025bf1b17d3fdc485260977375dc573d1fd81"}
Apr 17 20:21:20.417340 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:20.417300 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk" event={"ID":"038276ad-1ad9-48e1-9463-f2805ad83aba","Type":"ContainerStarted","Data":"2064b2a2e36e7dd34b572995bcc2d97a698330dbbf8ab479299a2340e7d7c1c1"}
Apr 17 20:21:20.417752 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:20.417405 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:20.443526 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:20.443469 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk" podStartSLOduration=2.001845856 podStartE2EDuration="4.443450897s" podCreationTimestamp="2026-04-17 20:21:16 +0000 UTC" firstStartedPulling="2026-04-17 20:21:16.955516247 +0000 UTC m=+370.127216310" lastFinishedPulling="2026-04-17 20:21:19.397121288 +0000 UTC m=+372.568821351" observedRunningTime="2026-04-17 20:21:20.441408379 +0000 UTC m=+373.613108482" watchObservedRunningTime="2026-04-17 20:21:20.443450897 +0000 UTC m=+373.615150977"
Apr 17 20:21:31.423383 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:31.423352 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-799c8bc7d9-5x2bk"
Apr 17 20:21:34.742055 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.742005 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"]
Apr 17 20:21:34.748807 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.748779 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:34.751331 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.751307 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 20:21:34.751482 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.751307 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-7h6rh\""
Apr 17 20:21:34.752101 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.752085 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 20:21:34.752181 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.752152 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 20:21:34.752241 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.752183 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 20:21:34.754410 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.754378 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"]
Apr 17 20:21:34.867271 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.867231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb2e5b1d-3bc7-4289-be66-337299e352eb-tmp\") pod \"kube-auth-proxy-686dfbc75c-zfx2b\" (UID: \"bb2e5b1d-3bc7-4289-be66-337299e352eb\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:34.867456 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.867286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2e5b1d-3bc7-4289-be66-337299e352eb-tls-certs\") pod \"kube-auth-proxy-686dfbc75c-zfx2b\" (UID: \"bb2e5b1d-3bc7-4289-be66-337299e352eb\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:34.867456 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.867341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5vxd\" (UniqueName: \"kubernetes.io/projected/bb2e5b1d-3bc7-4289-be66-337299e352eb-kube-api-access-w5vxd\") pod \"kube-auth-proxy-686dfbc75c-zfx2b\" (UID: \"bb2e5b1d-3bc7-4289-be66-337299e352eb\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:34.968632 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.968591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb2e5b1d-3bc7-4289-be66-337299e352eb-tmp\") pod \"kube-auth-proxy-686dfbc75c-zfx2b\" (UID: \"bb2e5b1d-3bc7-4289-be66-337299e352eb\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:34.968807 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.968643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2e5b1d-3bc7-4289-be66-337299e352eb-tls-certs\") pod \"kube-auth-proxy-686dfbc75c-zfx2b\" (UID: \"bb2e5b1d-3bc7-4289-be66-337299e352eb\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:34.968807 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.968664 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5vxd\" (UniqueName: \"kubernetes.io/projected/bb2e5b1d-3bc7-4289-be66-337299e352eb-kube-api-access-w5vxd\") pod \"kube-auth-proxy-686dfbc75c-zfx2b\" (UID: \"bb2e5b1d-3bc7-4289-be66-337299e352eb\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:34.971136 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.971107 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb2e5b1d-3bc7-4289-be66-337299e352eb-tmp\") pod \"kube-auth-proxy-686dfbc75c-zfx2b\" (UID: \"bb2e5b1d-3bc7-4289-be66-337299e352eb\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:34.971351 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.971328 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2e5b1d-3bc7-4289-be66-337299e352eb-tls-certs\") pod \"kube-auth-proxy-686dfbc75c-zfx2b\" (UID: \"bb2e5b1d-3bc7-4289-be66-337299e352eb\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:34.975679 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:34.975658 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5vxd\" (UniqueName: \"kubernetes.io/projected/bb2e5b1d-3bc7-4289-be66-337299e352eb-kube-api-access-w5vxd\") pod \"kube-auth-proxy-686dfbc75c-zfx2b\" (UID: \"bb2e5b1d-3bc7-4289-be66-337299e352eb\") " pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:35.058985 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:35.058937 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"
Apr 17 20:21:35.181371 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:35.181330 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b"]
Apr 17 20:21:35.185804 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:21:35.185773 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb2e5b1d_3bc7_4289_be66_337299e352eb.slice/crio-b27e1b4fb9c48e89f78b9c7cde755c336a56defeba89d1ccd14b41b95b0db496 WatchSource:0}: Error finding container b27e1b4fb9c48e89f78b9c7cde755c336a56defeba89d1ccd14b41b95b0db496: Status 404 returned error can't find the container with id b27e1b4fb9c48e89f78b9c7cde755c336a56defeba89d1ccd14b41b95b0db496
Apr 17 20:21:35.460996 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:35.460911 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b" event={"ID":"bb2e5b1d-3bc7-4289-be66-337299e352eb","Type":"ContainerStarted","Data":"b27e1b4fb9c48e89f78b9c7cde755c336a56defeba89d1ccd14b41b95b0db496"}
Apr 17 20:21:37.039332 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.039288 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-6phsz"]
Apr 17 20:21:37.048825 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.048796 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz"
Apr 17 20:21:37.051328 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.051051 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 17 20:21:37.051328 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.051296 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-qprwb\""
Apr 17 20:21:37.052333 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.052310 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-6phsz"]
Apr 17 20:21:37.086540 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.086496 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5bc5da-6f25-4bfc-95fb-1d3be17d3f11-cert\") pod \"odh-model-controller-858dbf95b8-6phsz\" (UID: \"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11\") " pod="opendatahub/odh-model-controller-858dbf95b8-6phsz"
Apr 17 20:21:37.086730 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.086580 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986zs\" (UniqueName: \"kubernetes.io/projected/de5bc5da-6f25-4bfc-95fb-1d3be17d3f11-kube-api-access-986zs\") pod \"odh-model-controller-858dbf95b8-6phsz\" (UID: \"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11\") " pod="opendatahub/odh-model-controller-858dbf95b8-6phsz"
Apr 17 20:21:37.188513 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.187979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5bc5da-6f25-4bfc-95fb-1d3be17d3f11-cert\") pod \"odh-model-controller-858dbf95b8-6phsz\" (UID: \"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11\") " pod="opendatahub/odh-model-controller-858dbf95b8-6phsz"
Apr 17 20:21:37.188513 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.188062 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-986zs\" (UniqueName: \"kubernetes.io/projected/de5bc5da-6f25-4bfc-95fb-1d3be17d3f11-kube-api-access-986zs\") pod \"odh-model-controller-858dbf95b8-6phsz\" (UID: \"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11\") " pod="opendatahub/odh-model-controller-858dbf95b8-6phsz"
Apr 17 20:21:37.188513 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:21:37.188288 2571 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 17 20:21:37.188513 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:21:37.188353 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5bc5da-6f25-4bfc-95fb-1d3be17d3f11-cert podName:de5bc5da-6f25-4bfc-95fb-1d3be17d3f11 nodeName:}" failed. No retries permitted until 2026-04-17 20:21:37.688332804 +0000 UTC m=+390.860032869 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de5bc5da-6f25-4bfc-95fb-1d3be17d3f11-cert") pod "odh-model-controller-858dbf95b8-6phsz" (UID: "de5bc5da-6f25-4bfc-95fb-1d3be17d3f11") : secret "odh-model-controller-webhook-cert" not found Apr 17 20:21:37.197725 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.197699 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-986zs\" (UniqueName: \"kubernetes.io/projected/de5bc5da-6f25-4bfc-95fb-1d3be17d3f11-kube-api-access-986zs\") pod \"odh-model-controller-858dbf95b8-6phsz\" (UID: \"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11\") " pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" Apr 17 20:21:37.691667 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.691630 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5bc5da-6f25-4bfc-95fb-1d3be17d3f11-cert\") pod \"odh-model-controller-858dbf95b8-6phsz\" (UID: \"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11\") " pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" Apr 17 20:21:37.694450 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.694422 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de5bc5da-6f25-4bfc-95fb-1d3be17d3f11-cert\") pod \"odh-model-controller-858dbf95b8-6phsz\" (UID: \"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11\") " pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" Apr 17 20:21:37.961648 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:37.961541 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" Apr 17 20:21:38.086221 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:38.086188 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-6phsz"] Apr 17 20:21:38.090127 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:21:38.090089 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5bc5da_6f25_4bfc_95fb_1d3be17d3f11.slice/crio-6b59f398c9f222c52ef3efb9f8acaa5069c47565aab841b1b65db30b5303094f WatchSource:0}: Error finding container 6b59f398c9f222c52ef3efb9f8acaa5069c47565aab841b1b65db30b5303094f: Status 404 returned error can't find the container with id 6b59f398c9f222c52ef3efb9f8acaa5069c47565aab841b1b65db30b5303094f Apr 17 20:21:38.470554 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:38.470489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" event={"ID":"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11","Type":"ContainerStarted","Data":"6b59f398c9f222c52ef3efb9f8acaa5069c47565aab841b1b65db30b5303094f"} Apr 17 20:21:39.475804 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:39.475742 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b" event={"ID":"bb2e5b1d-3bc7-4289-be66-337299e352eb","Type":"ContainerStarted","Data":"e942883b55b0927704821da254d5ea4d6c50a1a4df47ecfcbf805c12df18f465"} Apr 17 20:21:39.491711 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:39.491641 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-686dfbc75c-zfx2b" podStartSLOduration=1.7865980879999999 podStartE2EDuration="5.491620113s" podCreationTimestamp="2026-04-17 20:21:34 +0000 UTC" firstStartedPulling="2026-04-17 20:21:35.188068325 +0000 
UTC m=+388.359768395" lastFinishedPulling="2026-04-17 20:21:38.89309035 +0000 UTC m=+392.064790420" observedRunningTime="2026-04-17 20:21:39.490367946 +0000 UTC m=+392.662068045" watchObservedRunningTime="2026-04-17 20:21:39.491620113 +0000 UTC m=+392.663320195" Apr 17 20:21:41.483841 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:41.483797 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" event={"ID":"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11","Type":"ContainerStarted","Data":"dc32b2790462fbcb1dee095ae4d22ea3c3f2c30b929c984b1412375b0ff65700"} Apr 17 20:21:41.484319 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:41.484055 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" Apr 17 20:21:41.497922 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:41.497834 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" podStartSLOduration=1.26613306 podStartE2EDuration="4.497808966s" podCreationTimestamp="2026-04-17 20:21:37 +0000 UTC" firstStartedPulling="2026-04-17 20:21:38.092047764 +0000 UTC m=+391.263747837" lastFinishedPulling="2026-04-17 20:21:41.32372367 +0000 UTC m=+394.495423743" observedRunningTime="2026-04-17 20:21:41.497618036 +0000 UTC m=+394.669318130" watchObservedRunningTime="2026-04-17 20:21:41.497808966 +0000 UTC m=+394.669509049" Apr 17 20:21:42.488604 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:42.488568 2571 generic.go:358] "Generic (PLEG): container finished" podID="de5bc5da-6f25-4bfc-95fb-1d3be17d3f11" containerID="dc32b2790462fbcb1dee095ae4d22ea3c3f2c30b929c984b1412375b0ff65700" exitCode=1 Apr 17 20:21:42.489076 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:42.488661 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" event={"ID":"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11","Type":"ContainerDied","Data":"dc32b2790462fbcb1dee095ae4d22ea3c3f2c30b929c984b1412375b0ff65700"} Apr 17 20:21:42.489076 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:42.488946 2571 scope.go:117] "RemoveContainer" containerID="dc32b2790462fbcb1dee095ae4d22ea3c3f2c30b929c984b1412375b0ff65700" Apr 17 20:21:43.433277 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.433244 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2dkmq"] Apr 17 20:21:43.436158 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.436139 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 20:21:43.438927 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.438892 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 17 20:21:43.439032 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.438986 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-zhqh6\"" Apr 17 20:21:43.463709 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.463674 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2dkmq"] Apr 17 20:21:43.493632 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.493596 2571 generic.go:358] "Generic (PLEG): container finished" podID="de5bc5da-6f25-4bfc-95fb-1d3be17d3f11" containerID="ec812a9dc494d215dfd236e481207442ca3c1b74d438a504ee5a862376f09a2c" exitCode=1 Apr 17 20:21:43.494110 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.493682 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" event={"ID":"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11","Type":"ContainerDied","Data":"ec812a9dc494d215dfd236e481207442ca3c1b74d438a504ee5a862376f09a2c"} Apr 17 20:21:43.494110 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.493727 2571 scope.go:117] "RemoveContainer" containerID="dc32b2790462fbcb1dee095ae4d22ea3c3f2c30b929c984b1412375b0ff65700" Apr 17 20:21:43.494110 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.493964 2571 scope.go:117] "RemoveContainer" containerID="ec812a9dc494d215dfd236e481207442ca3c1b74d438a504ee5a862376f09a2c" Apr 17 20:21:43.494290 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:21:43.494211 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-6phsz_opendatahub(de5bc5da-6f25-4bfc-95fb-1d3be17d3f11)\"" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" podUID="de5bc5da-6f25-4bfc-95fb-1d3be17d3f11" Apr 17 20:21:43.545347 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.545306 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf7f7\" (UniqueName: \"kubernetes.io/projected/69f17798-07e0-48af-9329-7c19d18d34d6-kube-api-access-xf7f7\") pod \"kserve-controller-manager-856948b99f-2dkmq\" (UID: \"69f17798-07e0-48af-9329-7c19d18d34d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 20:21:43.545522 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.545398 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69f17798-07e0-48af-9329-7c19d18d34d6-cert\") pod \"kserve-controller-manager-856948b99f-2dkmq\" (UID: \"69f17798-07e0-48af-9329-7c19d18d34d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 20:21:43.646579 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.646542 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xf7f7\" (UniqueName: \"kubernetes.io/projected/69f17798-07e0-48af-9329-7c19d18d34d6-kube-api-access-xf7f7\") pod \"kserve-controller-manager-856948b99f-2dkmq\" (UID: \"69f17798-07e0-48af-9329-7c19d18d34d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 
20:21:43.646579 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.646580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69f17798-07e0-48af-9329-7c19d18d34d6-cert\") pod \"kserve-controller-manager-856948b99f-2dkmq\" (UID: \"69f17798-07e0-48af-9329-7c19d18d34d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 20:21:43.646802 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:21:43.646705 2571 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 20:21:43.646802 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:21:43.646757 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69f17798-07e0-48af-9329-7c19d18d34d6-cert podName:69f17798-07e0-48af-9329-7c19d18d34d6 nodeName:}" failed. No retries permitted until 2026-04-17 20:21:44.146740783 +0000 UTC m=+397.318440842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/69f17798-07e0-48af-9329-7c19d18d34d6-cert") pod "kserve-controller-manager-856948b99f-2dkmq" (UID: "69f17798-07e0-48af-9329-7c19d18d34d6") : secret "kserve-webhook-server-cert" not found Apr 17 20:21:43.657318 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:43.657285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf7f7\" (UniqueName: \"kubernetes.io/projected/69f17798-07e0-48af-9329-7c19d18d34d6-kube-api-access-xf7f7\") pod \"kserve-controller-manager-856948b99f-2dkmq\" (UID: \"69f17798-07e0-48af-9329-7c19d18d34d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 20:21:44.150066 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:44.150018 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69f17798-07e0-48af-9329-7c19d18d34d6-cert\") pod \"kserve-controller-manager-856948b99f-2dkmq\" (UID: \"69f17798-07e0-48af-9329-7c19d18d34d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 20:21:44.152624 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:44.152600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69f17798-07e0-48af-9329-7c19d18d34d6-cert\") pod \"kserve-controller-manager-856948b99f-2dkmq\" (UID: \"69f17798-07e0-48af-9329-7c19d18d34d6\") " pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 20:21:44.347930 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:44.347859 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 20:21:44.473002 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:44.472965 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2dkmq"] Apr 17 20:21:44.476038 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:21:44.476002 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69f17798_07e0_48af_9329_7c19d18d34d6.slice/crio-bb53c1a1dc11e333258b9b669d6e6c4b9859097df7f560866a87aa99b6e70085 WatchSource:0}: Error finding container bb53c1a1dc11e333258b9b669d6e6c4b9859097df7f560866a87aa99b6e70085: Status 404 returned error can't find the container with id bb53c1a1dc11e333258b9b669d6e6c4b9859097df7f560866a87aa99b6e70085 Apr 17 20:21:44.497999 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:44.497958 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" event={"ID":"69f17798-07e0-48af-9329-7c19d18d34d6","Type":"ContainerStarted","Data":"bb53c1a1dc11e333258b9b669d6e6c4b9859097df7f560866a87aa99b6e70085"} Apr 17 20:21:44.499618 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:44.499598 2571 scope.go:117] "RemoveContainer" containerID="ec812a9dc494d215dfd236e481207442ca3c1b74d438a504ee5a862376f09a2c" Apr 17 20:21:44.499793 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:21:44.499776 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-6phsz_opendatahub(de5bc5da-6f25-4bfc-95fb-1d3be17d3f11)\"" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" podUID="de5bc5da-6f25-4bfc-95fb-1d3be17d3f11" Apr 17 20:21:47.510937 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:47.510815 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" event={"ID":"69f17798-07e0-48af-9329-7c19d18d34d6","Type":"ContainerStarted","Data":"87457c11334b073b05ffed6c9c0613e32e1f32f94f89fdaef5ae213e68721ff6"} Apr 17 20:21:47.510937 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:47.510871 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 20:21:47.539200 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:47.539135 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" podStartSLOduration=1.987884764 podStartE2EDuration="4.539100165s" podCreationTimestamp="2026-04-17 20:21:43 +0000 UTC" firstStartedPulling="2026-04-17 20:21:44.477541128 +0000 UTC m=+397.649241189" lastFinishedPulling="2026-04-17 20:21:47.028756525 +0000 UTC m=+400.200456590" observedRunningTime="2026-04-17 20:21:47.538169696 +0000 UTC m=+400.709869776" watchObservedRunningTime="2026-04-17 20:21:47.539100165 +0000 UTC m=+400.710800227" Apr 17 20:21:48.859003 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:48.858967 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-bd578"] Apr 17 20:21:48.862174 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:48.862153 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" Apr 17 20:21:48.864474 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:48.864451 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 20:21:48.864608 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:48.864496 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 20:21:48.864608 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:48.864496 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-r52bf\"" Apr 17 20:21:48.874905 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:48.874852 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-bd578"] Apr 17 20:21:48.989358 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:48.989317 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/5123928d-caef-4b04-8507-ee02ffff81ed-operator-config\") pod \"servicemesh-operator3-55f49c5f94-bd578\" (UID: \"5123928d-caef-4b04-8507-ee02ffff81ed\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" Apr 17 20:21:48.989358 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:48.989364 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sg8v\" (UniqueName: \"kubernetes.io/projected/5123928d-caef-4b04-8507-ee02ffff81ed-kube-api-access-4sg8v\") pod \"servicemesh-operator3-55f49c5f94-bd578\" (UID: \"5123928d-caef-4b04-8507-ee02ffff81ed\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" Apr 17 20:21:49.089745 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:49.089706 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/5123928d-caef-4b04-8507-ee02ffff81ed-operator-config\") pod \"servicemesh-operator3-55f49c5f94-bd578\" (UID: \"5123928d-caef-4b04-8507-ee02ffff81ed\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" Apr 17 20:21:49.089875 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:49.089761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sg8v\" (UniqueName: \"kubernetes.io/projected/5123928d-caef-4b04-8507-ee02ffff81ed-kube-api-access-4sg8v\") pod \"servicemesh-operator3-55f49c5f94-bd578\" (UID: \"5123928d-caef-4b04-8507-ee02ffff81ed\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" Apr 17 20:21:49.092744 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:49.092712 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/5123928d-caef-4b04-8507-ee02ffff81ed-operator-config\") pod \"servicemesh-operator3-55f49c5f94-bd578\" (UID: \"5123928d-caef-4b04-8507-ee02ffff81ed\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" Apr 17 20:21:49.098381 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:49.098338 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sg8v\" (UniqueName: \"kubernetes.io/projected/5123928d-caef-4b04-8507-ee02ffff81ed-kube-api-access-4sg8v\") pod \"servicemesh-operator3-55f49c5f94-bd578\" (UID: 
\"5123928d-caef-4b04-8507-ee02ffff81ed\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" Apr 17 20:21:49.171496 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:49.171406 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" Apr 17 20:21:49.307091 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:49.307054 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-bd578"] Apr 17 20:21:49.311651 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:21:49.311619 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5123928d_caef_4b04_8507_ee02ffff81ed.slice/crio-4e2d1910fe19e8c9bf729d2618103a52e782bf139a65e324e26c294ef43daca5 WatchSource:0}: Error finding container 4e2d1910fe19e8c9bf729d2618103a52e782bf139a65e324e26c294ef43daca5: Status 404 returned error can't find the container with id 4e2d1910fe19e8c9bf729d2618103a52e782bf139a65e324e26c294ef43daca5 Apr 17 20:21:49.518261 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:49.518172 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" event={"ID":"5123928d-caef-4b04-8507-ee02ffff81ed","Type":"ContainerStarted","Data":"4e2d1910fe19e8c9bf729d2618103a52e782bf139a65e324e26c294ef43daca5"} Apr 17 20:21:51.484521 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:51.484476 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" Apr 17 20:21:51.485009 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:51.484983 2571 scope.go:117] "RemoveContainer" containerID="ec812a9dc494d215dfd236e481207442ca3c1b74d438a504ee5a862376f09a2c" Apr 17 20:21:51.485238 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:21:51.485209 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-6phsz_opendatahub(de5bc5da-6f25-4bfc-95fb-1d3be17d3f11)\"" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" podUID="de5bc5da-6f25-4bfc-95fb-1d3be17d3f11" Apr 17 20:21:52.530352 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:52.530317 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" event={"ID":"5123928d-caef-4b04-8507-ee02ffff81ed","Type":"ContainerStarted","Data":"a6f817018e5ef33ef71def84275dd3e8227f69f634327b339678307946c45baf"} Apr 17 20:21:52.530855 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:52.530438 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" Apr 17 20:21:52.564255 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:52.564203 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" podStartSLOduration=1.7015947649999998 podStartE2EDuration="4.564186908s" podCreationTimestamp="2026-04-17 20:21:48 +0000 UTC" firstStartedPulling="2026-04-17 20:21:49.314940953 +0000 UTC m=+402.486641012" lastFinishedPulling="2026-04-17 20:21:52.177533096 +0000 UTC m=+405.349233155" observedRunningTime="2026-04-17 20:21:52.561567914 +0000 UTC m=+405.733267994" watchObservedRunningTime="2026-04-17 20:21:52.564186908 +0000 UTC 
m=+405.735886988" Apr 17 20:21:53.853639 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.853602 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv"] Apr 17 20:21:53.863097 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.863070 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:53.865329 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.865299 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 20:21:53.865485 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.865426 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 20:21:53.865789 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.865764 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 20:21:53.866645 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.866416 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-5662f\"" Apr 17 20:21:53.866645 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.866484 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 20:21:53.868041 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.868020 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv"] Apr 17 20:21:53.930369 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.930333 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:53.930550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.930377 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:53.930550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.930399 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a7d42755-1cef-4602-a268-c0c0152aed8b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:53.930550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.930427 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a7d42755-1cef-4602-a268-c0c0152aed8b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:53.930550 ip-10-0-129-50 kubenswrapper[2571]: I0417 
20:21:53.930464 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2swgp\" (UniqueName: \"kubernetes.io/projected/a7d42755-1cef-4602-a268-c0c0152aed8b-kube-api-access-2swgp\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:53.930550 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.930520 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:53.930710 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:53.930571 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.031435 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.031400 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.031648 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.031467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.031648 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.031501 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.031648 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.031534 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.031648 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.031557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a7d42755-1cef-4602-a268-c0c0152aed8b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.031894 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.031695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a7d42755-1cef-4602-a268-c0c0152aed8b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.032103 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.031865 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2swgp\" (UniqueName: \"kubernetes.io/projected/a7d42755-1cef-4602-a268-c0c0152aed8b-kube-api-access-2swgp\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.032236 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.032202 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.034412 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.034377 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a7d42755-1cef-4602-a268-c0c0152aed8b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.034544 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.034529 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a7d42755-1cef-4602-a268-c0c0152aed8b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.034656 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.034637 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.034734 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.034714 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.038546 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.038519 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a7d42755-1cef-4602-a268-c0c0152aed8b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 
20:21:54.038866 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.038846 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2swgp\" (UniqueName: \"kubernetes.io/projected/a7d42755-1cef-4602-a268-c0c0152aed8b-kube-api-access-2swgp\") pod \"istiod-openshift-gateway-55ff986f96-mf7bv\" (UID: \"a7d42755-1cef-4602-a268-c0c0152aed8b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.174538 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.174438 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:54.311973 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.311941 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv"] Apr 17 20:21:54.314454 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:21:54.314423 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7d42755_1cef_4602_a268_c0c0152aed8b.slice/crio-b5e90e2a9e62bf2aa8863b57c059d38730230a3ac9ae47ffa718bc1045038cee WatchSource:0}: Error finding container b5e90e2a9e62bf2aa8863b57c059d38730230a3ac9ae47ffa718bc1045038cee: Status 404 returned error can't find the container with id b5e90e2a9e62bf2aa8863b57c059d38730230a3ac9ae47ffa718bc1045038cee Apr 17 20:21:54.537395 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:54.537358 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" event={"ID":"a7d42755-1cef-4602-a268-c0c0152aed8b","Type":"ContainerStarted","Data":"b5e90e2a9e62bf2aa8863b57c059d38730230a3ac9ae47ffa718bc1045038cee"} Apr 17 20:21:56.786857 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:56.786812 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 17 20:21:56.787141 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:56.786918 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 17 20:21:57.550657 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:57.550615 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" event={"ID":"a7d42755-1cef-4602-a268-c0c0152aed8b","Type":"ContainerStarted","Data":"e4a486cb696e9d2d5be5c47f8ecf7b3b8f7bb22a32825251287b15d5d034933a"} Apr 17 20:21:57.550946 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:57.550901 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:21:57.552785 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:57.552724 2571 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-mf7bv container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 17 20:21:57.552960 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:57.552902 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" podUID="a7d42755-1cef-4602-a268-c0c0152aed8b" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 
20:21:57.572154 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:57.572092 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" podStartSLOduration=2.101988201 podStartE2EDuration="4.572074293s" podCreationTimestamp="2026-04-17 20:21:53 +0000 UTC" firstStartedPulling="2026-04-17 20:21:54.316468527 +0000 UTC m=+407.488168588" lastFinishedPulling="2026-04-17 20:21:56.786554621 +0000 UTC m=+409.958254680" observedRunningTime="2026-04-17 20:21:57.57190904 +0000 UTC m=+410.743609136" watchObservedRunningTime="2026-04-17 20:21:57.572074293 +0000 UTC m=+410.743774375" Apr 17 20:21:57.962204 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:57.962118 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" Apr 17 20:21:57.962635 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:57.962612 2571 scope.go:117] "RemoveContainer" containerID="ec812a9dc494d215dfd236e481207442ca3c1b74d438a504ee5a862376f09a2c" Apr 17 20:21:58.555689 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:58.555650 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" event={"ID":"de5bc5da-6f25-4bfc-95fb-1d3be17d3f11","Type":"ContainerStarted","Data":"3b0cbb8f116fdffced616d4e1218c32a5b1779eabaa163e48dd5393ef0a3c554"} Apr 17 20:21:58.555971 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:58.555919 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" Apr 17 20:21:58.557139 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:21:58.557115 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mf7bv" Apr 17 20:22:03.535908 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:22:03.535848 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bd578" Apr 17 20:22:09.561960 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:22:09.561926 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-6phsz" Apr 17 20:22:18.519692 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:22:18.519658 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-2dkmq" Apr 17 20:23:14.883394 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:14.883359 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-cd2h9"] Apr 17 20:23:14.886609 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:14.886588 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-cd2h9" Apr 17 20:23:14.889321 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:14.889297 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 20:23:14.890141 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:14.890104 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 20:23:14.890256 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:14.890239 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-pvkwn\"" Apr 17 20:23:14.901203 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:14.901177 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-cd2h9"] Apr 17 20:23:14.993195 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:14.993158 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6cbp\" (UniqueName: \"kubernetes.io/projected/675a63a5-e8b3-4fe4-9200-0151facc0074-kube-api-access-r6cbp\") pod \"authorino-operator-657f44b778-cd2h9\" (UID: \"675a63a5-e8b3-4fe4-9200-0151facc0074\") " pod="kuadrant-system/authorino-operator-657f44b778-cd2h9" Apr 17 20:23:15.094513 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:15.094471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6cbp\" (UniqueName: \"kubernetes.io/projected/675a63a5-e8b3-4fe4-9200-0151facc0074-kube-api-access-r6cbp\") pod \"authorino-operator-657f44b778-cd2h9\" (UID: \"675a63a5-e8b3-4fe4-9200-0151facc0074\") " pod="kuadrant-system/authorino-operator-657f44b778-cd2h9" Apr 17 20:23:15.103658 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:15.103631 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6cbp\" (UniqueName: \"kubernetes.io/projected/675a63a5-e8b3-4fe4-9200-0151facc0074-kube-api-access-r6cbp\") pod \"authorino-operator-657f44b778-cd2h9\" (UID: \"675a63a5-e8b3-4fe4-9200-0151facc0074\") " pod="kuadrant-system/authorino-operator-657f44b778-cd2h9" Apr 17 20:23:15.197093 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:15.196991 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-cd2h9" Apr 17 20:23:15.318782 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:15.318757 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-cd2h9"] Apr 17 20:23:15.321931 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:23:15.321903 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod675a63a5_e8b3_4fe4_9200_0151facc0074.slice/crio-13ebde08beea0c2f7365e4391b0db95f0cf4180211c88d879cb738a8460a85d0 WatchSource:0}: Error finding container 13ebde08beea0c2f7365e4391b0db95f0cf4180211c88d879cb738a8460a85d0: Status 404 returned error can't find the container with id 13ebde08beea0c2f7365e4391b0db95f0cf4180211c88d879cb738a8460a85d0 Apr 17 20:23:15.796063 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:15.796026 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-cd2h9" event={"ID":"675a63a5-e8b3-4fe4-9200-0151facc0074","Type":"ContainerStarted","Data":"13ebde08beea0c2f7365e4391b0db95f0cf4180211c88d879cb738a8460a85d0"} Apr 17 20:23:17.803862 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:17.803818 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-cd2h9" event={"ID":"675a63a5-e8b3-4fe4-9200-0151facc0074","Type":"ContainerStarted","Data":"5a6ab653815ba84f1c7942a1af8368dc5d8d8e2a00636e12da747a2e27958d00"} Apr 17 20:23:17.804345 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:17.804000 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-cd2h9" Apr 17 20:23:17.821755 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:17.821704 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-cd2h9" podStartSLOduration=2.369702648 podStartE2EDuration="3.821687901s" podCreationTimestamp="2026-04-17 20:23:14 +0000 UTC" firstStartedPulling="2026-04-17 20:23:15.323739371 +0000 UTC m=+488.495439433" lastFinishedPulling="2026-04-17 20:23:16.775724623 +0000 UTC m=+489.947424686" observedRunningTime="2026-04-17 20:23:17.820571427 +0000 UTC m=+490.992271505" watchObservedRunningTime="2026-04-17 20:23:17.821687901 +0000 UTC m=+490.993387981" Apr 17 20:23:28.810179 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:28.810147 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-cd2h9" Apr 17 20:23:30.002059 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.002018 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f"] Apr 17 20:23:30.005435 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.005415 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" Apr 17 20:23:30.007773 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.007741 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-sl7bv\"" Apr 17 20:23:30.012079 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.011965 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-984pp\" (UniqueName: \"kubernetes.io/projected/bff9c9aa-45c5-41c6-a95a-641918b7bec0-kube-api-access-984pp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-8q97f\" (UID: \"bff9c9aa-45c5-41c6-a95a-641918b7bec0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" Apr 17 20:23:30.012162 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.012076 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bff9c9aa-45c5-41c6-a95a-641918b7bec0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-8q97f\" (UID: \"bff9c9aa-45c5-41c6-a95a-641918b7bec0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" Apr 17 20:23:30.015507 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.015476 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f"] Apr 17 20:23:30.113382 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.113332 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bff9c9aa-45c5-41c6-a95a-641918b7bec0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-8q97f\" (UID: \"bff9c9aa-45c5-41c6-a95a-641918b7bec0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" Apr 17 20:23:30.113382 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.113389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-984pp\" (UniqueName: \"kubernetes.io/projected/bff9c9aa-45c5-41c6-a95a-641918b7bec0-kube-api-access-984pp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-8q97f\" (UID: \"bff9c9aa-45c5-41c6-a95a-641918b7bec0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" Apr 17 20:23:30.113739 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.113717 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bff9c9aa-45c5-41c6-a95a-641918b7bec0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-8q97f\" (UID: \"bff9c9aa-45c5-41c6-a95a-641918b7bec0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" Apr 17 20:23:30.123108 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.123076 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-984pp\" (UniqueName: \"kubernetes.io/projected/bff9c9aa-45c5-41c6-a95a-641918b7bec0-kube-api-access-984pp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-8q97f\" (UID: \"bff9c9aa-45c5-41c6-a95a-641918b7bec0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" Apr 17 20:23:30.317022 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.316976 2571 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" Apr 17 20:23:30.443590 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.443566 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f"] Apr 17 20:23:30.446416 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:23:30.446387 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff9c9aa_45c5_41c6_a95a_641918b7bec0.slice/crio-6c6220278569fe34549d87a67f8ac1a420285a226e166c2c57b0a31e049d796d WatchSource:0}: Error finding container 6c6220278569fe34549d87a67f8ac1a420285a226e166c2c57b0a31e049d796d: Status 404 returned error can't find the container with id 6c6220278569fe34549d87a67f8ac1a420285a226e166c2c57b0a31e049d796d Apr 17 20:23:30.848770 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:30.848731 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" event={"ID":"bff9c9aa-45c5-41c6-a95a-641918b7bec0","Type":"ContainerStarted","Data":"6c6220278569fe34549d87a67f8ac1a420285a226e166c2c57b0a31e049d796d"} Apr 17 20:23:34.864900 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:34.864838 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" event={"ID":"bff9c9aa-45c5-41c6-a95a-641918b7bec0","Type":"ContainerStarted","Data":"ba5ebaf7283e3131abf8500d249acc459b87607ccf1a3d041c542b926d8af980"} Apr 17 20:23:34.865367 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:34.864930 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" Apr 17 20:23:34.884656 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:34.884306 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" podStartSLOduration=1.881683877 podStartE2EDuration="5.884288292s" podCreationTimestamp="2026-04-17 20:23:29 +0000 UTC" firstStartedPulling="2026-04-17 20:23:30.448669518 +0000 UTC m=+503.620369580" lastFinishedPulling="2026-04-17 20:23:34.451273937 +0000 UTC m=+507.622973995" observedRunningTime="2026-04-17 20:23:34.882651497 +0000 UTC m=+508.054351579" watchObservedRunningTime="2026-04-17 20:23:34.884288292 +0000 UTC m=+508.055988374" Apr 17 20:23:45.870723 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:23:45.870692 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-8q97f" Apr 17 20:24:07.266424 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:07.266382 2571 scope.go:117] "RemoveContainer" containerID="225aaf7957df714fe517be686d11c965bde020a2ba83eceb06abbb7f62c5ea1c" Apr 17 20:24:42.023303 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:42.023265 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-696d5c5dc7-4l9cv"] Apr 17 20:24:42.026536 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:42.026514 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-696d5c5dc7-4l9cv" Apr 17 20:24:42.028745 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:42.028721 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-fwz42\"" Apr 17 20:24:42.033145 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:42.033122 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-696d5c5dc7-4l9cv"] Apr 17 20:24:42.089628 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:42.089592 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f8qq\" (UniqueName: \"kubernetes.io/projected/57a6a19c-bec1-41d4-8b02-0243b61f5906-kube-api-access-7f8qq\") pod \"maas-controller-696d5c5dc7-4l9cv\" (UID: \"57a6a19c-bec1-41d4-8b02-0243b61f5906\") " pod="opendatahub/maas-controller-696d5c5dc7-4l9cv" Apr 17 20:24:42.130868 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:42.130832 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-696d5c5dc7-4l9cv"] Apr 17 20:24:42.131095 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:24:42.131075 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7f8qq], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-696d5c5dc7-4l9cv" podUID="57a6a19c-bec1-41d4-8b02-0243b61f5906" Apr 17 20:24:42.190585 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:42.190541 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7f8qq\" (UniqueName: \"kubernetes.io/projected/57a6a19c-bec1-41d4-8b02-0243b61f5906-kube-api-access-7f8qq\") pod \"maas-controller-696d5c5dc7-4l9cv\" (UID: \"57a6a19c-bec1-41d4-8b02-0243b61f5906\") " pod="opendatahub/maas-controller-696d5c5dc7-4l9cv" Apr 17 20:24:42.197823 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:42.197788 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f8qq\" (UniqueName: \"kubernetes.io/projected/57a6a19c-bec1-41d4-8b02-0243b61f5906-kube-api-access-7f8qq\") pod \"maas-controller-696d5c5dc7-4l9cv\" (UID: \"57a6a19c-bec1-41d4-8b02-0243b61f5906\") " pod="opendatahub/maas-controller-696d5c5dc7-4l9cv" Apr 17 20:24:43.085367 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:43.085323 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-696d5c5dc7-4l9cv" Apr 17 20:24:43.090129 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:43.090104 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-696d5c5dc7-4l9cv" Apr 17 20:24:43.096901 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:43.096865 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f8qq\" (UniqueName: \"kubernetes.io/projected/57a6a19c-bec1-41d4-8b02-0243b61f5906-kube-api-access-7f8qq\") pod \"57a6a19c-bec1-41d4-8b02-0243b61f5906\" (UID: \"57a6a19c-bec1-41d4-8b02-0243b61f5906\") " Apr 17 20:24:43.099111 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:43.099088 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a6a19c-bec1-41d4-8b02-0243b61f5906-kube-api-access-7f8qq" (OuterVolumeSpecName: "kube-api-access-7f8qq") pod "57a6a19c-bec1-41d4-8b02-0243b61f5906" (UID: "57a6a19c-bec1-41d4-8b02-0243b61f5906"). InnerVolumeSpecName "kube-api-access-7f8qq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:24:43.197463 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:43.197423 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7f8qq\" (UniqueName: \"kubernetes.io/projected/57a6a19c-bec1-41d4-8b02-0243b61f5906-kube-api-access-7f8qq\") on node \"ip-10-0-129-50.ec2.internal\" DevicePath \"\"" Apr 17 20:24:44.088254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:44.088223 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-696d5c5dc7-4l9cv" Apr 17 20:24:44.115149 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:44.115109 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-696d5c5dc7-4l9cv"] Apr 17 20:24:44.119836 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:44.119807 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-696d5c5dc7-4l9cv"] Apr 17 20:24:45.339351 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:45.339304 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a6a19c-bec1-41d4-8b02-0243b61f5906" path="/var/lib/kubelet/pods/57a6a19c-bec1-41d4-8b02-0243b61f5906/volumes" Apr 17 20:24:57.402777 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:57.402739 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-579c6d859-qrv8d"] Apr 17 20:24:57.409281 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:57.409250 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-579c6d859-qrv8d" Apr 17 20:24:57.411846 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:57.411822 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-fwz42\"" Apr 17 20:24:57.414851 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:57.414826 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-579c6d859-qrv8d"] Apr 17 20:24:57.508665 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:57.508630 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfdxs\" (UniqueName: \"kubernetes.io/projected/107ad4f4-ece5-4b9f-b3a9-5bf254eb0b99-kube-api-access-wfdxs\") pod \"maas-controller-579c6d859-qrv8d\" (UID: \"107ad4f4-ece5-4b9f-b3a9-5bf254eb0b99\") " pod="opendatahub/maas-controller-579c6d859-qrv8d" Apr 17 20:24:57.610055 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:57.610019 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfdxs\" (UniqueName: \"kubernetes.io/projected/107ad4f4-ece5-4b9f-b3a9-5bf254eb0b99-kube-api-access-wfdxs\") pod \"maas-controller-579c6d859-qrv8d\" (UID: \"107ad4f4-ece5-4b9f-b3a9-5bf254eb0b99\") " pod="opendatahub/maas-controller-579c6d859-qrv8d" Apr 17 20:24:57.617838 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:57.617806 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfdxs\" (UniqueName: \"kubernetes.io/projected/107ad4f4-ece5-4b9f-b3a9-5bf254eb0b99-kube-api-access-wfdxs\") pod \"maas-controller-579c6d859-qrv8d\" (UID: \"107ad4f4-ece5-4b9f-b3a9-5bf254eb0b99\") " pod="opendatahub/maas-controller-579c6d859-qrv8d" Apr 17 20:24:57.722681 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:57.722587 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-579c6d859-qrv8d" Apr 17 20:24:57.845547 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:57.845517 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-579c6d859-qrv8d"] Apr 17 20:24:57.848027 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:24:57.848000 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod107ad4f4_ece5_4b9f_b3a9_5bf254eb0b99.slice/crio-ca83bc28363b1af0f28967b39c976d856d678a836491989d10f29f3eacad7a43 WatchSource:0}: Error finding container ca83bc28363b1af0f28967b39c976d856d678a836491989d10f29f3eacad7a43: Status 404 returned error can't find the container with id ca83bc28363b1af0f28967b39c976d856d678a836491989d10f29f3eacad7a43 Apr 17 20:24:58.132349 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:24:58.132312 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-579c6d859-qrv8d" event={"ID":"107ad4f4-ece5-4b9f-b3a9-5bf254eb0b99","Type":"ContainerStarted","Data":"ca83bc28363b1af0f28967b39c976d856d678a836491989d10f29f3eacad7a43"} Apr 17 20:25:00.139791 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:00.139763 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-579c6d859-qrv8d" event={"ID":"107ad4f4-ece5-4b9f-b3a9-5bf254eb0b99","Type":"ContainerStarted","Data":"929dbbc17303747e999c717a6d023771ea686e4cb9d46b83febab6a9a70d03f9"} Apr 17 20:25:00.140166 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:00.139905 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-579c6d859-qrv8d" Apr 17 20:25:00.155406 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:00.155346 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-579c6d859-qrv8d" podStartSLOduration=0.930494733 podStartE2EDuration="3.155325615s" podCreationTimestamp="2026-04-17 20:24:57 +0000 UTC" firstStartedPulling="2026-04-17 20:24:57.849241848 +0000 UTC m=+591.020941907" lastFinishedPulling="2026-04-17 20:25:00.07407272 +0000 UTC m=+593.245772789" observedRunningTime="2026-04-17 20:25:00.153862612 +0000 UTC m=+593.325562692" watchObservedRunningTime="2026-04-17 20:25:00.155325615 +0000 UTC m=+593.327025696" Apr 17 20:25:11.149099 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:11.149059 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-579c6d859-qrv8d" Apr 17 20:25:44.229526 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.229481 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv"] Apr 17 20:25:44.235751 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.235728 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.238097 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.238074 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 20:25:44.238975 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.238952 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 20:25:44.239082 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.238952 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 17 20:25:44.239082 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.238957 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-vsxc9\"" Apr 17 20:25:44.243057 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.243035 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv"] Apr 17 20:25:44.294256 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.294226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.294455 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.294264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.294455 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.294333 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.294455 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.294380 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.294455 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.294408 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l984m\" (UniqueName: \"kubernetes.io/projected/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-kube-api-access-l984m\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.294455 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.294448 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.395166 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.395130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.395166 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.395170 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.395413 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.395192 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.395413 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.395213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.395413 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.395321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l984m\" (UniqueName: \"kubernetes.io/projected/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-kube-api-access-l984m\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.395413 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.395380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.395605 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.395586 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.395659 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.395609 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.395750 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.395728 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.397673 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.397651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.398046 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.398026 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.403124 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.403104 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l984m\" (UniqueName: \"kubernetes.io/projected/0b26aa56-031e-4fa7-b2b8-fea36dcfd935-kube-api-access-l984m\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv\" (UID: \"0b26aa56-031e-4fa7-b2b8-fea36dcfd935\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.547416 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.547378 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:25:44.674769 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.674724 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv"] Apr 17 20:25:44.676982 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:25:44.676954 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b26aa56_031e_4fa7_b2b8_fea36dcfd935.slice/crio-13d3ce6ebd6f2f55ca6c7a30034d8264421c5b920ec9b1ba6133fc519191e5c2 WatchSource:0}: Error finding container 13d3ce6ebd6f2f55ca6c7a30034d8264421c5b920ec9b1ba6133fc519191e5c2: Status 404 returned error can't find the container with id 13d3ce6ebd6f2f55ca6c7a30034d8264421c5b920ec9b1ba6133fc519191e5c2 Apr 17 20:25:44.682605 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:44.681169 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:25:45.295458 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:45.295420 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" event={"ID":"0b26aa56-031e-4fa7-b2b8-fea36dcfd935","Type":"ContainerStarted","Data":"13d3ce6ebd6f2f55ca6c7a30034d8264421c5b920ec9b1ba6133fc519191e5c2"} Apr 17 20:25:52.323200 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:52.323154 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" event={"ID":"0b26aa56-031e-4fa7-b2b8-fea36dcfd935","Type":"ContainerStarted","Data":"a2ee1dd803727b3fe502ec1fc533f0fe5012a8b87f67f18506307a68731a5a80"} Apr 17 20:25:56.717928 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.717859 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm"] Apr 17 20:25:56.721658 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.721634 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.723632 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.723607 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 20:25:56.730766 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.730744 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm"] Apr 17 20:25:56.800224 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.800191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.800224 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.800231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.800425 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.800253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.800425 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.800350 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8t7q\" (UniqueName: \"kubernetes.io/projected/4d5aef40-c11b-45b3-b475-42b8cada4032-kube-api-access-k8t7q\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.800494 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.800429 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5aef40-c11b-45b3-b475-42b8cada4032-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.800494 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.800470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.901335 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.901297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.901335 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.901349 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8t7q\" (UniqueName: \"kubernetes.io/projected/4d5aef40-c11b-45b3-b475-42b8cada4032-kube-api-access-k8t7q\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.901609 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.901395 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5aef40-c11b-45b3-b475-42b8cada4032-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.901609 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.901421 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.901609 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.901458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.901609 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.901486 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.902249 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.902220 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-model-cache\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.902404 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.902267 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-home\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.902587 ip-10-0-129-50 kubenswrapper[2571]: I0417 
20:25:56.902552 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.904077 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.904033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4d5aef40-c11b-45b3-b475-42b8cada4032-dshm\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.904705 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.904674 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5aef40-c11b-45b3-b475-42b8cada4032-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:56.910099 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:56.910074 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8t7q\" (UniqueName: \"kubernetes.io/projected/4d5aef40-c11b-45b3-b475-42b8cada4032-kube-api-access-k8t7q\") pod \"premium-simulated-simulated-premium-kserve-555d546bff-zwqwm\" (UID: \"4d5aef40-c11b-45b3-b475-42b8cada4032\") " pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:57.031951 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:57.031911 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:25:57.177185 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:57.177045 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm"] Apr 17 20:25:57.340080 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:57.339985 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" event={"ID":"4d5aef40-c11b-45b3-b475-42b8cada4032","Type":"ContainerStarted","Data":"3c2839593d3df7b39f2a2253de0b90d6b49b14845e2c28842aa02fa0c64bfb61"} Apr 17 20:25:57.340080 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:57.340019 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" event={"ID":"4d5aef40-c11b-45b3-b475-42b8cada4032","Type":"ContainerStarted","Data":"a0d28d6cab90e9393263b9799e31b81a83422959540411a5b269c366412c77c9"} Apr 17 20:25:57.341383 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:57.341353 2571 generic.go:358] "Generic (PLEG): container finished" podID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" containerID="a2ee1dd803727b3fe502ec1fc533f0fe5012a8b87f67f18506307a68731a5a80" exitCode=0 Apr 17 20:25:57.341522 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:57.341401 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" event={"ID":"0b26aa56-031e-4fa7-b2b8-fea36dcfd935","Type":"ContainerDied","Data":"a2ee1dd803727b3fe502ec1fc533f0fe5012a8b87f67f18506307a68731a5a80"} Apr 17 20:25:59.349172 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:59.349146 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/0.log" Apr 17 20:25:59.349603 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:59.349476 2571 generic.go:358] "Generic (PLEG): container finished" podID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" containerID="e6dae42775bb10820b343f600c9ed69f627fa6c3248bcb2e5c3cbeb40f43436e" exitCode=2 Apr 17 20:25:59.349603 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:59.349513 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" event={"ID":"0b26aa56-031e-4fa7-b2b8-fea36dcfd935","Type":"ContainerDied","Data":"e6dae42775bb10820b343f600c9ed69f627fa6c3248bcb2e5c3cbeb40f43436e"} Apr 17 20:25:59.349818 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:25:59.349803 2571 scope.go:117] "RemoveContainer" containerID="e6dae42775bb10820b343f600c9ed69f627fa6c3248bcb2e5c3cbeb40f43436e" Apr 17 20:26:00.354793 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:00.354760 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/1.log" Apr 17 20:26:00.355225 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:00.355209 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/0.log" Apr 17 20:26:00.355604 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:00.355577 2571 generic.go:358] "Generic (PLEG): container finished" podID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" containerID="837fb27512979c889e8de6f0945e47d9ce41e27dbe13604d72c97bf4047fa9f6" exitCode=2 Apr 17 
20:26:00.355711 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:00.355621 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" event={"ID":"0b26aa56-031e-4fa7-b2b8-fea36dcfd935","Type":"ContainerDied","Data":"837fb27512979c889e8de6f0945e47d9ce41e27dbe13604d72c97bf4047fa9f6"} Apr 17 20:26:00.355711 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:00.355681 2571 scope.go:117] "RemoveContainer" containerID="e6dae42775bb10820b343f600c9ed69f627fa6c3248bcb2e5c3cbeb40f43436e" Apr 17 20:26:00.356157 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:00.356141 2571 scope.go:117] "RemoveContainer" containerID="837fb27512979c889e8de6f0945e47d9ce41e27dbe13604d72c97bf4047fa9f6" Apr 17 20:26:00.356396 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:00.356371 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:26:01.359737 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:01.359702 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/1.log" Apr 17 20:26:04.548531 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:04.548477 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:26:04.548531 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:04.548536 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:26:04.549135 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:04.549003 2571 scope.go:117] "RemoveContainer" containerID="837fb27512979c889e8de6f0945e47d9ce41e27dbe13604d72c97bf4047fa9f6" Apr 17 20:26:04.549263 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:04.549244 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:26:06.381412 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:06.380124 2571 generic.go:358] "Generic (PLEG): container finished" podID="4d5aef40-c11b-45b3-b475-42b8cada4032" containerID="3c2839593d3df7b39f2a2253de0b90d6b49b14845e2c28842aa02fa0c64bfb61" exitCode=0 Apr 17 20:26:06.381412 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:06.380182 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" event={"ID":"4d5aef40-c11b-45b3-b475-42b8cada4032","Type":"ContainerDied","Data":"3c2839593d3df7b39f2a2253de0b90d6b49b14845e2c28842aa02fa0c64bfb61"} Apr 17 20:26:07.385522 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:07.385489 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/0.log" Apr 17 
20:26:07.386008 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:07.385942 2571 generic.go:358] "Generic (PLEG): container finished" podID="4d5aef40-c11b-45b3-b475-42b8cada4032" containerID="e556691cd39e1e0122722643fee04c0343b1e6576598e9058d643cb4b0567794" exitCode=2 Apr 17 20:26:07.386068 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:07.386030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" event={"ID":"4d5aef40-c11b-45b3-b475-42b8cada4032","Type":"ContainerDied","Data":"e556691cd39e1e0122722643fee04c0343b1e6576598e9058d643cb4b0567794"} Apr 17 20:26:07.386516 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:07.386499 2571 scope.go:117] "RemoveContainer" containerID="e556691cd39e1e0122722643fee04c0343b1e6576598e9058d643cb4b0567794" Apr 17 20:26:08.391746 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:08.391714 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/1.log" Apr 17 20:26:08.392159 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:08.392133 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/0.log" Apr 17 20:26:08.392479 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:08.392454 2571 generic.go:358] "Generic (PLEG): container finished" podID="4d5aef40-c11b-45b3-b475-42b8cada4032" containerID="7cedee7305a826ecbee1407b4d6b0cc99bf5284a170e84bb1451e94deb865804" exitCode=2 Apr 17 20:26:08.392524 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:08.392493 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" event={"ID":"4d5aef40-c11b-45b3-b475-42b8cada4032","Type":"ContainerDied","Data":"7cedee7305a826ecbee1407b4d6b0cc99bf5284a170e84bb1451e94deb865804"} Apr 17 20:26:08.392559 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:08.392520 2571 scope.go:117] "RemoveContainer" containerID="e556691cd39e1e0122722643fee04c0343b1e6576598e9058d643cb4b0567794" Apr 17 20:26:08.393063 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:08.393044 2571 scope.go:117] "RemoveContainer" containerID="7cedee7305a826ecbee1407b4d6b0cc99bf5284a170e84bb1451e94deb865804" Apr 17 20:26:08.393300 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:08.393269 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:26:09.397182 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:09.397154 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/1.log" Apr 17 20:26:17.033026 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:17.032973 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:26:17.033026 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:17.033031 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" 
Apr 17 20:26:17.033552 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:17.033494 2571 scope.go:117] "RemoveContainer" containerID="7cedee7305a826ecbee1407b4d6b0cc99bf5284a170e84bb1451e94deb865804"
Apr 17 20:26:17.033721 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:17.033701 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032"
Apr 17 20:26:18.335240 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:18.335203 2571 scope.go:117] "RemoveContainer" containerID="837fb27512979c889e8de6f0945e47d9ce41e27dbe13604d72c97bf4047fa9f6"
Apr 17 20:26:19.431368 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:19.431338 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/2.log"
Apr 17 20:26:19.431791 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:19.431714 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/1.log"
Apr 17 20:26:19.432026 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:19.432006 2571 generic.go:358] "Generic (PLEG): container finished" podID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" containerID="2461b4b5f1e4c2f697ef52b4716b9a35c04dcb5e4df1db30e10c92b5dedda310" exitCode=2
Apr 17 20:26:19.432119 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:19.432088 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" event={"ID":"0b26aa56-031e-4fa7-b2b8-fea36dcfd935","Type":"ContainerDied","Data":"2461b4b5f1e4c2f697ef52b4716b9a35c04dcb5e4df1db30e10c92b5dedda310"}
Apr 17 20:26:19.432176 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:19.432149 2571 scope.go:117] "RemoveContainer" containerID="837fb27512979c889e8de6f0945e47d9ce41e27dbe13604d72c97bf4047fa9f6"
Apr 17 20:26:19.432596 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:19.432578 2571 scope.go:117] "RemoveContainer" containerID="2461b4b5f1e4c2f697ef52b4716b9a35c04dcb5e4df1db30e10c92b5dedda310"
Apr 17 20:26:19.432833 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:19.432816 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935"
Apr 17 20:26:20.436798 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:20.436765 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/2.log"
Apr 17 20:26:24.547897 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:24.547845 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv"
Apr 17 20:26:24.548276 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:24.547961 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv"
Apr 17 20:26:24.548447 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:24.548429 2571 scope.go:117] "RemoveContainer" containerID="2461b4b5f1e4c2f697ef52b4716b9a35c04dcb5e4df1db30e10c92b5dedda310"
Apr 17 20:26:24.548645 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:24.548626 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935"
Apr 17 20:26:29.335467 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:29.335437 2571 scope.go:117] "RemoveContainer" containerID="7cedee7305a826ecbee1407b4d6b0cc99bf5284a170e84bb1451e94deb865804"
Apr 17 20:26:30.470933 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:30.470907 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/2.log"
Apr 17 20:26:30.471343 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:30.471287 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/1.log"
Apr 17 20:26:30.471625 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:30.471603 2571 generic.go:358] "Generic (PLEG): container finished" podID="4d5aef40-c11b-45b3-b475-42b8cada4032" containerID="05f951c76275e9edcf1b1c48376b84f5c91f4fb24ec8ef8262e7ad1f64ef8d18" exitCode=2
Apr 17 20:26:30.471717 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:30.471683 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" event={"ID":"4d5aef40-c11b-45b3-b475-42b8cada4032","Type":"ContainerDied","Data":"05f951c76275e9edcf1b1c48376b84f5c91f4fb24ec8ef8262e7ad1f64ef8d18"}
Apr 17 20:26:30.471752 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:30.471726 2571 scope.go:117] "RemoveContainer" containerID="7cedee7305a826ecbee1407b4d6b0cc99bf5284a170e84bb1451e94deb865804"
Apr 17 20:26:30.472131 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:30.472114 2571 scope.go:117] "RemoveContainer" containerID="05f951c76275e9edcf1b1c48376b84f5c91f4fb24ec8ef8262e7ad1f64ef8d18"
Apr 17 20:26:30.472347 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:30.472327 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032"
Apr 17 20:26:31.476819 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:31.476790 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/2.log"
Apr 17 20:26:35.335621 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:35.335580 2571 scope.go:117] "RemoveContainer" containerID="2461b4b5f1e4c2f697ef52b4716b9a35c04dcb5e4df1db30e10c92b5dedda310"
Apr 17 20:26:35.336206 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:35.335848 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935"
Apr 17 20:26:37.032466 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:37.032423 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm"
Apr 17 20:26:37.032873 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:37.032512 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm"
Apr 17 20:26:37.032950 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:37.032913 2571 scope.go:117] "RemoveContainer" containerID="05f951c76275e9edcf1b1c48376b84f5c91f4fb24ec8ef8262e7ad1f64ef8d18"
Apr 17 20:26:37.033116 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:37.033100 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032"
Apr 17 20:26:37.498477 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:37.498403 2571 scope.go:117] "RemoveContainer" containerID="05f951c76275e9edcf1b1c48376b84f5c91f4fb24ec8ef8262e7ad1f64ef8d18"
Apr 17 20:26:37.498621 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:37.498573 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032"
Apr 17 20:26:47.338553 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:47.338526 2571 scope.go:117] "RemoveContainer" containerID="2461b4b5f1e4c2f697ef52b4716b9a35c04dcb5e4df1db30e10c92b5dedda310"
Apr 17 20:26:48.541221 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:48.541194 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/3.log"
Apr 17 20:26:48.541670 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:48.541588 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/2.log"
Apr 17 20:26:48.541931 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:48.541909 2571 generic.go:358] "Generic (PLEG): container finished" podID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" containerID="8b4bc6b14548e684e1f9e2595ffb3797bd1dbe3480d8ee7522d50cd70b86e5a6" exitCode=2
Apr 17 20:26:48.542004 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:48.541946 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" event={"ID":"0b26aa56-031e-4fa7-b2b8-fea36dcfd935","Type":"ContainerDied","Data":"8b4bc6b14548e684e1f9e2595ffb3797bd1dbe3480d8ee7522d50cd70b86e5a6"}
Apr 17 20:26:48.542004 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:48.541975 2571 scope.go:117] "RemoveContainer" containerID="2461b4b5f1e4c2f697ef52b4716b9a35c04dcb5e4df1db30e10c92b5dedda310"
Apr 17 20:26:48.542368 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:48.542349 2571 scope.go:117] "RemoveContainer" containerID="8b4bc6b14548e684e1f9e2595ffb3797bd1dbe3480d8ee7522d50cd70b86e5a6"
Apr 17 20:26:48.542585 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:48.542565 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935"
Apr 17 20:26:49.546527 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:49.546501 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/3.log"
Apr 17 20:26:51.335705 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:51.335661 2571 scope.go:117] "RemoveContainer" containerID="05f951c76275e9edcf1b1c48376b84f5c91f4fb24ec8ef8262e7ad1f64ef8d18"
Apr 17 20:26:52.558429 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:52.558398 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/3.log"
Apr 17 20:26:52.558840 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:52.558796 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/2.log"
Apr 17 20:26:52.559129 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:52.559108 2571 generic.go:358] "Generic (PLEG): container finished" podID="4d5aef40-c11b-45b3-b475-42b8cada4032" containerID="d8d669046628db59411d62d0b4ed3c7f34ec1f0810b316fd1639519c4a86572f" exitCode=2
Apr 17 20:26:52.559182 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:52.559159 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" event={"ID":"4d5aef40-c11b-45b3-b475-42b8cada4032","Type":"ContainerDied","Data":"d8d669046628db59411d62d0b4ed3c7f34ec1f0810b316fd1639519c4a86572f"}
Apr 17 20:26:52.559216 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:52.559190 2571 scope.go:117] "RemoveContainer" containerID="05f951c76275e9edcf1b1c48376b84f5c91f4fb24ec8ef8262e7ad1f64ef8d18"
Apr 17 20:26:52.559587 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:52.559566 2571 scope.go:117] "RemoveContainer" containerID="d8d669046628db59411d62d0b4ed3c7f34ec1f0810b316fd1639519c4a86572f"
Apr 17 20:26:52.559805 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:52.559783 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032"
Apr 17 20:26:53.564106 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:53.564076 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/3.log"
Apr 17 20:26:54.547655 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:54.547608 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv"
Apr 17 20:26:54.547655 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:54.547660 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv"
Apr 17 20:26:54.548155 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:54.548141 2571 scope.go:117] "RemoveContainer" containerID="8b4bc6b14548e684e1f9e2595ffb3797bd1dbe3480d8ee7522d50cd70b86e5a6"
Apr 17 20:26:54.548366 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:54.548341 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935"
Apr 17 20:26:57.032989 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:57.032951 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm"
Apr 17 20:26:57.032989 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:57.032996 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm"
Apr 17 20:26:57.033478 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:26:57.033365 2571 scope.go:117] "RemoveContainer" containerID="d8d669046628db59411d62d0b4ed3c7f34ec1f0810b316fd1639519c4a86572f"
Apr 17 20:26:57.033549 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:26:57.033531 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032"
Apr 17 20:27:09.334858 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:09.334825 2571 scope.go:117] "RemoveContainer" containerID="8b4bc6b14548e684e1f9e2595ffb3797bd1dbe3480d8ee7522d50cd70b86e5a6"
Apr 17 20:27:09.335260 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:27:09.335039 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935"
Apr 17 20:27:12.334680 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:12.334636 2571 scope.go:117] "RemoveContainer" containerID="d8d669046628db59411d62d0b4ed3c7f34ec1f0810b316fd1639519c4a86572f"
Apr 17 20:27:12.335150 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:27:12.334917 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032"
Apr 17 20:27:20.335115 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:20.335077 2571 scope.go:117] "RemoveContainer" containerID="8b4bc6b14548e684e1f9e2595ffb3797bd1dbe3480d8ee7522d50cd70b86e5a6"
Apr 17 20:27:20.335600 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:27:20.335266 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935"
Apr 17 20:27:26.335613 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:26.335572 2571 scope.go:117] "RemoveContainer" containerID="d8d669046628db59411d62d0b4ed3c7f34ec1f0810b316fd1639519c4a86572f"
Apr 17 20:27:26.336140 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:27:26.335843 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032"
Apr 17 20:27:33.335140 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:33.335105 2571 scope.go:117] "RemoveContainer" containerID="8b4bc6b14548e684e1f9e2595ffb3797bd1dbe3480d8ee7522d50cd70b86e5a6"
Apr 17 20:27:34.702462 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:34.702432 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/4.log"
Apr 17 20:27:34.702905 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:34.702796 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/3.log"
Apr 17 20:27:34.703137 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:34.703111 2571 generic.go:358] "Generic (PLEG): container finished" podID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" containerID="588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027" exitCode=2
Apr 17 20:27:34.703220 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:34.703149 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" event={"ID":"0b26aa56-031e-4fa7-b2b8-fea36dcfd935","Type":"ContainerDied","Data":"588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027"}
Apr 17 20:27:34.703220 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:34.703196 2571 scope.go:117] "RemoveContainer" containerID="8b4bc6b14548e684e1f9e2595ffb3797bd1dbe3480d8ee7522d50cd70b86e5a6"
Apr 17 20:27:34.703608 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:34.703589 2571 scope.go:117] "RemoveContainer" containerID="588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027"
Apr 17 20:27:34.703848 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:27:34.703825 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935"
Apr 17 20:27:35.707584 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:35.707559 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/4.log"
Apr 17 20:27:38.335076 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:38.335042 2571 scope.go:117] "RemoveContainer" containerID="d8d669046628db59411d62d0b4ed3c7f34ec1f0810b316fd1639519c4a86572f"
Apr 17 20:27:38.719324 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:38.719294 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/4.log"
Apr 17 20:27:38.719689 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:38.719667 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/3.log"
Apr 17 20:27:38.720025 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:38.720004 2571 generic.go:358] "Generic (PLEG): container finished" podID="4d5aef40-c11b-45b3-b475-42b8cada4032" containerID="226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f" exitCode=2
Apr 17 20:27:38.720098 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:38.720075 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" event={"ID":"4d5aef40-c11b-45b3-b475-42b8cada4032","Type":"ContainerDied","Data":"226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f"}
Apr 17 20:27:38.720142 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:38.720106 2571 scope.go:117] "RemoveContainer" containerID="d8d669046628db59411d62d0b4ed3c7f34ec1f0810b316fd1639519c4a86572f"
Apr 17 20:27:38.720617 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:38.720594 2571 scope.go:117] "RemoveContainer" containerID="226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f"
Apr 17 20:27:38.720837 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:27:38.720817 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032"
Apr 17 20:27:39.724459 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:39.724429 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/4.log"
Apr 17 20:27:44.548242 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:44.548199 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv"
Apr 17 20:27:44.548242 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:44.548237 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:27:44.548699 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:44.548658 2571 scope.go:117] "RemoveContainer" containerID="588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027" Apr 17 20:27:44.548846 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:27:44.548827 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:27:47.032591 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:47.032545 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:27:47.032591 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:47.032597 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:27:47.033066 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:47.033025 2571 scope.go:117] "RemoveContainer" containerID="226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f" Apr 17 20:27:47.033214 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:27:47.033195 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:27:58.334708 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:58.334675 2571 scope.go:117] "RemoveContainer" containerID="588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027" Apr 17 20:27:58.335211 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:27:58.334908 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:27:59.335014 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:27:59.334980 2571 scope.go:117] "RemoveContainer" containerID="226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f" Apr 17 20:27:59.335481 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:27:59.335182 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:28:09.335379 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:09.335334 2571 scope.go:117] "RemoveContainer" containerID="588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027" Apr 17 20:28:09.335874 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:28:09.335584 2571 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:28:14.335148 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:14.335109 2571 scope.go:117] "RemoveContainer" containerID="226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f" Apr 17 20:28:14.335559 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:28:14.335320 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:28:20.334977 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:20.334938 2571 scope.go:117] "RemoveContainer" containerID="588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027" Apr 17 20:28:20.335372 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:28:20.335127 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:28:28.335069 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:28.334980 2571 scope.go:117] "RemoveContainer" containerID="226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f" Apr 17 20:28:28.335575 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:28:28.335230 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:28:32.335697 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:32.335654 2571 scope.go:117] "RemoveContainer" containerID="588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027" Apr 17 20:28:32.336113 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:28:32.335927 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:28:43.335245 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:43.335209 2571 scope.go:117] "RemoveContainer" containerID="226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f" Apr 17 20:28:43.335658 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:28:43.335404 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 
1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:28:45.335521 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:45.335482 2571 scope.go:117] "RemoveContainer" containerID="588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027" Apr 17 20:28:45.336003 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:28:45.335719 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:28:56.334889 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:56.334856 2571 scope.go:117] "RemoveContainer" containerID="588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027" Apr 17 20:28:56.983718 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:56.983625 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/5.log" Apr 17 20:28:56.984028 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:56.984010 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/4.log" Apr 17 20:28:56.984310 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:56.984290 2571 generic.go:358] "Generic (PLEG): container finished" podID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" exitCode=2 Apr 17 20:28:56.984381 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:56.984323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" event={"ID":"0b26aa56-031e-4fa7-b2b8-fea36dcfd935","Type":"ContainerDied","Data":"e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c"} Apr 17 20:28:56.984381 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:56.984350 2571 scope.go:117] "RemoveContainer" containerID="588267b1e943c84de8a6c2d84c5ea930199573986faa4a78fb6c108c9dd3e027" Apr 17 20:28:56.984808 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:56.984775 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:28:56.985043 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:28:56.985023 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:28:57.989776 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:57.989748 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/5.log" Apr 17 20:28:58.335272 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:28:58.335239 2571 scope.go:117] 
"RemoveContainer" containerID="226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f" Apr 17 20:28:58.335454 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:28:58.335434 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:29:04.547627 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:04.547583 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:29:04.547627 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:04.547631 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" Apr 17 20:29:04.548184 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:04.548111 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:29:04.548308 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:29:04.548290 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:29:11.335657 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:11.335618 2571 scope.go:117] "RemoveContainer" containerID="226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f" Apr 17 20:29:12.039633 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:12.039608 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/5.log" Apr 17 20:29:12.040038 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:12.040020 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/4.log" Apr 17 20:29:12.040319 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:12.040300 2571 generic.go:358] "Generic (PLEG): container finished" podID="4d5aef40-c11b-45b3-b475-42b8cada4032" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" exitCode=2 Apr 17 20:29:12.040401 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:12.040378 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" event={"ID":"4d5aef40-c11b-45b3-b475-42b8cada4032","Type":"ContainerDied","Data":"efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010"} Apr 17 20:29:12.040456 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:12.040426 2571 scope.go:117] "RemoveContainer" containerID="226dfce3737eefa9124f202a170a7aa8059f103dcb91f9291794cc242983578f" Apr 17 20:29:12.040807 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:12.040789 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:29:12.041059 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:29:12.041038 2571 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:29:13.045222 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:13.045190 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/5.log" Apr 17 20:29:16.335002 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:16.334967 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:29:16.335396 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:29:16.335170 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:29:17.033098 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:17.033055 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:29:17.033098 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:17.033105 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" Apr 17 20:29:17.033557 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:17.033541 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:29:17.033736 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:29:17.033719 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:29:30.335450 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:30.335418 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:29:30.335976 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:29:30.335613 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:29:32.335126 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:32.335094 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:29:32.335500 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:29:32.335249 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with 
CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:29:42.335725 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:42.335672 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:29:42.336230 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:29:42.335919 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:29:45.335611 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:45.335576 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:29:45.336021 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:29:45.335794 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:29:55.335859 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:55.335762 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:29:55.336344 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:29:55.335980 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:29:59.335320 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:29:59.335287 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:29:59.335717 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:29:59.335476 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:30:07.275263 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:07.275232 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/5.log" Apr 17 20:30:07.275956 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:07.275938 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/5.log" Apr 17 
20:30:07.276898 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:07.276864 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/5.log" Apr 17 20:30:07.277517 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:07.277499 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/5.log" Apr 17 20:30:09.335707 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:09.335666 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:30:09.336114 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:30:09.335835 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:30:10.335745 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:10.335711 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:30:10.336213 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:30:10.335924 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:30:21.335471 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:21.335432 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:30:21.335960 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:30:21.335628 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:30:22.335443 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:22.335406 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:30:22.335625 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:30:22.335603 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:30:33.334739 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:33.334694 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:30:33.335255 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:30:33.334859 2571 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:30:34.335443 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:34.335409 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:30:34.335834 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:30:34.335607 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:30:35.829174 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:35.829136 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-2dkmq_69f17798-07e0-48af-9329-7c19d18d34d6/manager/0.log" Apr 17 20:30:36.168575 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:36.168463 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-579c6d859-qrv8d_107ad4f4-ece5-4b9f-b3a9-5bf254eb0b99/manager/0.log" Apr 17 20:30:36.277854 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:36.277825 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-6phsz_de5bc5da-6f25-4bfc-95fb-1d3be17d3f11/manager/2.log" Apr 17 20:30:36.505536 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:36.505449 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-799c8bc7d9-5x2bk_038276ad-1ad9-48e1-9463-f2805ad83aba/manager/0.log" Apr 17 20:30:38.025698 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:38.025667 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-cd2h9_675a63a5-e8b3-4fe4-9200-0151facc0074/manager/0.log" Apr 17 20:30:38.456200 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:38.456112 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-8q97f_bff9c9aa-45c5-41c6-a95a-641918b7bec0/manager/0.log" Apr 17 20:30:39.088536 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:39.088502 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mf7bv_a7d42755-1cef-4602-a268-c0c0152aed8b/discovery/0.log" Apr 17 20:30:39.190042 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:39.190012 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-686dfbc75c-zfx2b_bb2e5b1d-3bc7-4289-be66-337299e352eb/kube-auth-proxy/0.log" Apr 17 20:30:40.155377 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:40.155346 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/storage-initializer/0.log" Apr 17 20:30:40.161297 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:40.161267 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_0b26aa56-031e-4fa7-b2b8-fea36dcfd935/main/5.log" Apr 17 20:30:40.383870 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:40.383839 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/storage-initializer/0.log" Apr 17 20:30:40.389676 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:40.389656 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_4d5aef40-c11b-45b3-b475-42b8cada4032/main/5.log" Apr 17 20:30:46.334696 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:46.334661 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:30:46.337085 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:46.334764 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:30:46.337085 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:30:46.334841 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:30:46.337085 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:30:46.334982 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:30:47.314186 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:47.314152 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-92pvt_30a5e026-7834-4983-8c72-94991eb23377/global-pull-secret-syncer/0.log" Apr 17 20:30:47.459543 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:47.459494 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vhbpf_785f90d6-d02c-465a-8c2b-761d1fb1e10b/konnectivity-agent/0.log" Apr 17 20:30:47.499646 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:47.499615 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-50.ec2.internal_06b467c1ac790ac275ddafdf170566ce/haproxy/0.log" Apr 17 20:30:51.940568 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:51.940540 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-cd2h9_675a63a5-e8b3-4fe4-9200-0151facc0074/manager/0.log" Apr 17 20:30:52.063115 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:52.063077 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-8q97f_bff9c9aa-45c5-41c6-a95a-641918b7bec0/manager/0.log" Apr 17 20:30:54.112483 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:54.112456 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zmmdh_0c7dcbd5-6f69-40f5-85e8-13bd474164db/node-exporter/0.log" 
Apr 17 20:30:54.132183 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:54.132157 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zmmdh_0c7dcbd5-6f69-40f5-85e8-13bd474164db/kube-rbac-proxy/0.log" Apr 17 20:30:54.150739 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:54.150708 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zmmdh_0c7dcbd5-6f69-40f5-85e8-13bd474164db/init-textfile/0.log" Apr 17 20:30:56.039501 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.039463 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4"] Apr 17 20:30:56.042793 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.042772 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.044813 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.044786 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-klrcl\"/\"default-dockercfg-b2tbp\"" Apr 17 20:30:56.044813 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.044807 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-klrcl\"/\"openshift-service-ca.crt\"" Apr 17 20:30:56.044990 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.044837 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-klrcl\"/\"kube-root-ca.crt\"" Apr 17 20:30:56.051543 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.051517 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4"] Apr 17 20:30:56.142937 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.142875 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-lib-modules\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.143104 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.142970 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-sys\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.143104 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.143018 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-proc\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.143104 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.143079 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-podres\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.143209 
ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.143118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctg6x\" (UniqueName: \"kubernetes.io/projected/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-kube-api-access-ctg6x\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.244549 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.244511 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-sys\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.244549 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.244546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-proc\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.244797 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.244604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-podres\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.244797 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.244636 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctg6x\" (UniqueName: \"kubernetes.io/projected/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-kube-api-access-ctg6x\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.244797 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.244651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-sys\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.244797 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.244719 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-proc\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.244797 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.244755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-lib-modules\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.244797 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.244772 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-podres\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.245054 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.244822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-lib-modules\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.251591 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.251564 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctg6x\" (UniqueName: \"kubernetes.io/projected/d7be0af0-79d4-4dcd-b6e4-badccda06e9f-kube-api-access-ctg6x\") pod \"perf-node-gather-daemonset-ckql4\" (UID: \"d7be0af0-79d4-4dcd-b6e4-badccda06e9f\") " pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.354235 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.354134 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:56.482660 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.482627 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4"] Apr 17 20:30:56.484009 ip-10-0-129-50 kubenswrapper[2571]: W0417 20:30:56.483979 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd7be0af0_79d4_4dcd_b6e4_badccda06e9f.slice/crio-31e0ef3a729d1a03fade298668576c836f54855deb1914438fe0634e436cf6fa WatchSource:0}: Error finding container 31e0ef3a729d1a03fade298668576c836f54855deb1914438fe0634e436cf6fa: Status 404 returned error can't find the container with id 31e0ef3a729d1a03fade298668576c836f54855deb1914438fe0634e436cf6fa Apr 17 20:30:56.486046 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:56.486021 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:30:57.337778 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:57.337752 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:30:57.338168 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:57.337828 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:30:57.338168 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:30:57.337959 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:30:57.338168 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:30:57.337993 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" 
podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:30:57.402188 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:57.402150 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" event={"ID":"d7be0af0-79d4-4dcd-b6e4-badccda06e9f","Type":"ContainerStarted","Data":"dcaea68aedc546ce75e52f5bf6dc20eee2f922fa13d910fa56bc20ec13d40ef3"} Apr 17 20:30:57.402188 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:57.402190 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" event={"ID":"d7be0af0-79d4-4dcd-b6e4-badccda06e9f","Type":"ContainerStarted","Data":"31e0ef3a729d1a03fade298668576c836f54855deb1914438fe0634e436cf6fa"} Apr 17 20:30:57.402452 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:57.402298 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:30:57.417514 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:57.417469 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" podStartSLOduration=1.4174540549999999 podStartE2EDuration="1.417454055s" podCreationTimestamp="2026-04-17 20:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:30:57.415491338 +0000 UTC m=+950.587191419" watchObservedRunningTime="2026-04-17 20:30:57.417454055 +0000 UTC m=+950.589154136" Apr 17 20:30:58.110225 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:58.110194 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qzgfk_0ac23e99-190f-4c9d-80e2-1ba8d936a9f1/dns/0.log" Apr 17 20:30:58.130480 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:58.130454 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qzgfk_0ac23e99-190f-4c9d-80e2-1ba8d936a9f1/kube-rbac-proxy/0.log" Apr 17 20:30:58.218946 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:58.218912 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kjf7q_eeee41e0-c56f-4312-97f4-258ec5fc4d4d/dns-node-resolver/0.log" Apr 17 20:30:58.733525 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:58.733491 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n66bj_e2f38744-a4e3-4cc0-94fe-0190e5c2c772/node-ca/0.log" Apr 17 20:30:59.561051 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:59.561022 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mf7bv_a7d42755-1cef-4602-a268-c0c0152aed8b/discovery/0.log" Apr 17 20:30:59.578784 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:30:59.578756 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-686dfbc75c-zfx2b_bb2e5b1d-3bc7-4289-be66-337299e352eb/kube-auth-proxy/0.log" Apr 17 20:31:00.158782 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:00.158750 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-thggh_84435488-0f4e-4e67-98ca-b76e36c0b583/serve-healthcheck-canary/0.log" Apr 17 20:31:00.757261 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:00.757229 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-cqfpr_cd3ca34b-26ea-4674-bf57-67d8b016f2dd/kube-rbac-proxy/0.log" Apr 17 20:31:00.775582 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:00.775538 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cqfpr_cd3ca34b-26ea-4674-bf57-67d8b016f2dd/exporter/0.log" Apr 17 20:31:00.794547 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:00.794524 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cqfpr_cd3ca34b-26ea-4674-bf57-67d8b016f2dd/extractor/0.log" Apr 17 20:31:02.594530 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:02.594500 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-2dkmq_69f17798-07e0-48af-9329-7c19d18d34d6/manager/0.log" Apr 17 20:31:02.658021 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:02.657994 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-579c6d859-qrv8d_107ad4f4-ece5-4b9f-b3a9-5bf254eb0b99/manager/0.log" Apr 17 20:31:02.678602 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:02.678570 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-6phsz_de5bc5da-6f25-4bfc-95fb-1d3be17d3f11/manager/1.log" Apr 17 20:31:02.689040 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:02.689015 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-6phsz_de5bc5da-6f25-4bfc-95fb-1d3be17d3f11/manager/2.log" Apr 17 20:31:02.759868 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:02.759832 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-799c8bc7d9-5x2bk_038276ad-1ad9-48e1-9463-f2805ad83aba/manager/0.log" Apr 17 20:31:03.416836 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:03.416797 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-klrcl/perf-node-gather-daemonset-ckql4" Apr 17 20:31:03.950688 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:03.950653 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-9q62g_a3391230-98fc-4d37-8cc0-4986d17eb994/openshift-lws-operator/0.log" Apr 17 20:31:09.634874 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:09.634841 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9qqd7_4b2daba4-d463-4f69-96fb-a701b6be9b79/kube-multus/0.log" Apr 17 20:31:09.662556 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:09.662525 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ks7qb_5b1dd3fa-14f5-434c-919f-b8ceabccf2b6/kube-multus-additional-cni-plugins/0.log" Apr 17 20:31:09.682729 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:09.682696 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ks7qb_5b1dd3fa-14f5-434c-919f-b8ceabccf2b6/egress-router-binary-copy/0.log" Apr 17 20:31:09.702254 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:09.702226 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ks7qb_5b1dd3fa-14f5-434c-919f-b8ceabccf2b6/cni-plugins/0.log" Apr 17 20:31:09.721157 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:09.721135 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ks7qb_5b1dd3fa-14f5-434c-919f-b8ceabccf2b6/bond-cni-plugin/0.log" Apr 17 20:31:09.742498 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:09.742469 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ks7qb_5b1dd3fa-14f5-434c-919f-b8ceabccf2b6/routeoverride-cni/0.log" Apr 17 20:31:09.764149 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:09.764124 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ks7qb_5b1dd3fa-14f5-434c-919f-b8ceabccf2b6/whereabouts-cni-bincopy/0.log" Apr 17 20:31:09.783528 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:09.783489 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ks7qb_5b1dd3fa-14f5-434c-919f-b8ceabccf2b6/whereabouts-cni/0.log" Apr 17 20:31:10.275262 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:10.275237 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-pwx7f_85885202-a740-4319-827b-236ded2de085/network-metrics-daemon/0.log" Apr 17 20:31:10.295308 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:10.295280 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-pwx7f_85885202-a740-4319-827b-236ded2de085/kube-rbac-proxy/0.log" Apr 17 20:31:11.335375 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:11.335342 2571 scope.go:117] "RemoveContainer" containerID="efbaf2f6a024f68c063b3115cd0a99afe21beef35ae1dd166a0c8b243e1d9010" Apr 17 20:31:11.335920 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:11.335498 2571 scope.go:117] "RemoveContainer" containerID="e29617ac3c8b59f45071d506840427e9bdf579d7e60709627c3faad379e0ea3c" Apr 17 20:31:11.335920 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:31:11.335621 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=premium-simulated-simulated-premium-kserve-555d546bff-zwqwm_llm(4d5aef40-c11b-45b3-b475-42b8cada4032)\"" pod="llm/premium-simulated-simulated-premium-kserve-555d546bff-zwqwm" podUID="4d5aef40-c11b-45b3-b475-42b8cada4032" Apr 17 20:31:11.335920 ip-10-0-129-50 kubenswrapper[2571]: E0417 20:31:11.335790 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv_llm(0b26aa56-031e-4fa7-b2b8-fea36dcfd935)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c848hlv" podUID="0b26aa56-031e-4fa7-b2b8-fea36dcfd935" Apr 17 20:31:11.381796 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:11.381763 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmwcg_fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26/ovn-controller/0.log" Apr 17 20:31:11.404845 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:11.404815 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmwcg_fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26/ovn-acl-logging/0.log" Apr 17 20:31:11.423443 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:11.423418 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmwcg_fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26/kube-rbac-proxy-node/0.log" Apr 17 
20:31:11.442441 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:11.442414 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmwcg_fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 20:31:11.459597 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:11.459569 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmwcg_fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26/northd/0.log" Apr 17 20:31:11.479334 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:11.479297 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmwcg_fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26/nbdb/0.log" Apr 17 20:31:11.498028 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:11.498002 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmwcg_fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26/sbdb/0.log" Apr 17 20:31:11.591963 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:11.591871 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmwcg_fcf008c7-b5c3-4d78-9dcc-a9522d8e0a26/ovnkube-controller/0.log" Apr 17 20:31:12.987272 ip-10-0-129-50 kubenswrapper[2571]: I0417 20:31:12.987243 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-z2fcw_261e1de3-1829-4520-b7ef-6bb874d9f16e/network-check-target-container/0.log"