Apr 16 22:02:09.026368 ip-10-0-129-68 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 22:02:09.026382 ip-10-0-129-68 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 22:02:09.026392 ip-10-0-129-68 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 22:02:09.026812 ip-10-0-129-68 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 22:02:19.155929 ip-10-0-129-68 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 22:02:19.155946 ip-10-0-129-68 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 632426711b444ba59023d31052d733fa --
Apr 16 22:04:48.401036 ip-10-0-129-68 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:04:48.870574 ip-10-0-129-68 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:04:48.870574 ip-10-0-129-68 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:04:48.870574 ip-10-0-129-68 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:04:48.870574 ip-10-0-129-68 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:04:48.870574 ip-10-0-129-68 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:04:48.871250 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.870626 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:04:48.874268 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874252 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:04:48.874268 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874268 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874272 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874276 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874279 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874282 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874285 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874288 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874291 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874293 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874296 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874299 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874302 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874305 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874308 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874311 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874314 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874317 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874320 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874323 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874325 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:04:48.874334 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874328 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874331 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874334 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874337 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874340 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874343 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874345 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874348 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874351 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874353 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874356 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874358 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874361 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874363 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874366 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874370 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874374 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874377 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874380 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874384 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:04:48.874827 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874387 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874390 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874392 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874395 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874398 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874401 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874403 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
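[Editor's note: the deprecated-flag warnings earlier in this log say those parameters belong in the file passed to the kubelet's --config flag. A minimal, hypothetical sketch of the equivalent KubeletConfiguration fields follows; field names are from the kubelet.config.k8s.io/v1beta1 API, but the values are illustrative only and not taken from this node.]

```yaml
# Hypothetical fragment of /etc/kubernetes/kubelet.conf (values illustrative).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
```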
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874406 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874408 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874411 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874414 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874416 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874419 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874421 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874425 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874427 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874430 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874432 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874435 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:04:48.875375 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874437 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874440 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874442 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874445 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874448 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874450 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874453 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874456 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874458 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874462 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874467 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874470 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874473 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874476 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874482 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874484 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874487 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874490 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874493 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874495 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:04:48.875866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874498 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874501 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874504 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874508 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874512 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874516 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874932 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874938 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874941 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874944 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874946 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874951 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874955 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874958 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874961 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874964 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874966 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874969 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874972 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:04:48.876336 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874974 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874976 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874979 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874982 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874984 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874988 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874990 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874994 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874996 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.874999 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875001 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875004 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875006 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875009 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875012 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875014 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875017 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875020 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875025 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:04:48.876828 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875028 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875032 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875035 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875038 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875042 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875047 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875051 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875055 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875058 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875062 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875065 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875068 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875070 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875073 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875076 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875078 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875081 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875085 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875087 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875090 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:04:48.877305 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875093 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875102 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875105 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875107 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875110 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875112 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875115 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875118 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875120 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875123 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875125 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875128 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875130 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875133 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875136 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875138 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875140 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875143 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875145 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875148 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:04:48.877866 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875150 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875153 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875155 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875158 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875161 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875163 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875166 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875169 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875171 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875174 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875177 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875179 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875182 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.875184 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876577 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876587 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876596 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876601 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876606 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876610 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876615 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:04:48.878384 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876620 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876623 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876626 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876629 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876633 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876636 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876639 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876642 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876645 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876648 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876650 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876653 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876660 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876663 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876666 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876670 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876673 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876677 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876680 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876684 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876732 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876739 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876743 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876747 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876752 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:04:48.878914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876756 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876761 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876765 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876768 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876772 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876775 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876778 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876784 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876788 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876791 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876794 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876798 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876802 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876806 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876809 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876812 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876815 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876818 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876821 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876824 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876827 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16
22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876830 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876833 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876837 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876841 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 22:04:48.879513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876844 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876848 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876852 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876877 2576 flags.go:64] FLAG: --help="false" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876880 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-129-68.ec2.internal" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876884 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876887 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876890 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876894 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876897 2576 flags.go:64] FLAG: 
--image-gc-high-threshold="85" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876900 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876903 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876906 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876909 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876912 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876915 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876919 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876922 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876924 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876928 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876930 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876934 2576 flags.go:64] FLAG: --lock-file="" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876936 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876939 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:04:48.880124 ip-10-0-129-68 kubenswrapper[2576]: I0416 
22:04:48.876943 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876957 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876960 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876963 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876966 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876969 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876972 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876975 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876978 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876983 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876987 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876992 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876995 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.876999 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877002 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 
22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877006 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877009 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877012 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877015 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877022 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877025 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877028 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877031 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:04:48.880709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877034 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877040 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877043 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877046 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877049 2576 flags.go:64] FLAG: --port="10250" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877052 2576 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877055 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-051afa9ef065463e5" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877058 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877061 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877064 2576 flags.go:64] FLAG: --register-node="true" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877067 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877070 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877074 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877077 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877080 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877083 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877087 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877090 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877093 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877096 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877104 2576 flags.go:64] FLAG: 
--runonce="false" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877107 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877110 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877113 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877116 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877119 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:04:48.881245 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877122 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877125 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877129 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877132 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877135 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877138 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877141 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877144 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877147 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:04:48.881909 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:04:48.877150 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877155 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877158 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877161 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877166 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877169 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877172 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877175 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877177 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877180 2576 flags.go:64] FLAG: --v="2" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877185 2576 flags.go:64] FLAG: --version="false" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877189 2576 flags.go:64] FLAG: --vmodule="" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877193 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.877197 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877301 2576 feature_gate.go:328] unrecognized feature gate: 
InsightsConfig Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877305 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:04:48.881909 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877308 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877313 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877317 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877321 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877325 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877329 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877332 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877335 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877338 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877340 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877343 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:04:48.882519 ip-10-0-129-68 
kubenswrapper[2576]: W0416 22:04:48.877345 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877348 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877351 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877353 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877356 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877358 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877361 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877363 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877366 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:04:48.882519 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877369 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877372 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877374 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877377 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:04:48.883044 
ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877379 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877382 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877384 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877387 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877390 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877392 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877395 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877397 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877400 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877404 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877406 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877409 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877412 2576 feature_gate.go:328] 
unrecognized feature gate: UpgradeStatus Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877415 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877418 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877420 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:04:48.883044 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877422 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877425 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877427 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877430 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877432 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877435 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877437 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877440 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877442 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 
22:04:48.877445 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877447 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877450 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877452 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877455 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877458 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877460 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877463 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877465 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877468 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877470 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:04:48.883558 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877473 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877475 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts 
Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877478 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877480 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877483 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877485 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877488 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877490 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877492 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877495 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877498 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877501 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877503 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877505 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877508 2576 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877511 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877513 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877515 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877518 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:04:48.884071 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877520 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:04:48.884529 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877524 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:04:48.884529 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877528 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:04:48.884529 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877531 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:04:48.884529 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.877533 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:04:48.884529 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.878178 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:04:48.886817 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.886677 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 22:04:48.886817 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.886814 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886867 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886873 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886877 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886880 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:04:48.886944 
ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886883 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886886 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886889 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886892 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886895 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886897 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886900 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886903 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886905 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886908 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886910 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886913 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886915 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 
22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886918 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886921 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:04:48.886944 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886923 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886926 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886929 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886932 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886935 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886938 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886941 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886944 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886946 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886951 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886956 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886960 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886962 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886966 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886968 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886971 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886974 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886977 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886980 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:04:48.887438 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886983 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886986 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886988 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886991 2576 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiNetworks Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886993 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886996 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.886998 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887001 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887003 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887006 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887009 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887011 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887014 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887016 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887019 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887022 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 
22:04:48.887025 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887028 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887030 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887033 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:04:48.887927 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887035 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887039 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887042 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887045 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887047 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887050 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887052 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887055 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887058 2576 feature_gate.go:328] 
unrecognized feature gate: AWSDedicatedHosts Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887061 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887063 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887066 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887069 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887071 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887074 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887076 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887079 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887082 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887085 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887088 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:04:48.888431 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887091 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887093 2576 
feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887096 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887098 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887101 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887104 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887107 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887110 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.887116 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887209 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887214 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:04:48.888958 
ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887217 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887219 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887222 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887224 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:04:48.888958 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887227 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887229 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887232 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887234 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887238 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887240 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887243 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887245 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887248 2576 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887251 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887254 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887258 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887261 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887264 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887267 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887270 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887273 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887276 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887278 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:04:48.889368 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887281 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887284 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 
22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887286 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887289 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887292 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887294 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887297 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887300 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887303 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887305 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887308 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887310 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887313 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887315 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887318 2576 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887320 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887323 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887325 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887328 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:04:48.889849 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887331 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887334 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887337 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887339 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887342 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887345 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887347 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887350 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 
22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887352 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887355 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887357 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887360 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887362 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887365 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887368 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887370 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887373 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887375 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887377 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887380 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:04:48.890310 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887383 2576 feature_gate.go:328] unrecognized 
feature gate: BuildCSIVolumes Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887386 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887388 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887391 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887393 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887396 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887398 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887401 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887404 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887406 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887409 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887411 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887414 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887416 2576 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887419 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887422 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887424 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887427 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887429 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887432 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:04:48.890808 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887434 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:04:48.891287 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:48.887437 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:04:48.891287 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.887442 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:04:48.891287 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.887559 2576 server.go:962] 
"Client rotation is on, will bootstrap in background"
Apr 16 22:04:48.891287 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.890368 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 22:04:48.891287 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.891272 2576 server.go:1019] "Starting client certificate rotation"
Apr 16 22:04:48.891424 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.891365 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:04:48.891424 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.891408 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:04:48.919686 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.919663 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:04:48.924273 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.924253 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:04:48.936315 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.936289 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 22:04:48.942547 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.942529 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 22:04:48.944910 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.944879 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 22:04:48.947784 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.947755 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:04:48.949178 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.949151 2576 fs.go:135] Filesystem UUIDs: map[0c7a0568-c4ad-4df5-b996-d654980475ba:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 ac4eac2a-c7a6-411e-bbe8-8753a2b7ac5e:/dev/nvme0n1p4]
Apr 16 22:04:48.949251 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.949177 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 22:04:48.955332 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.955215 2576 manager.go:217] Machine: {Timestamp:2026-04-16 22:04:48.954093915 +0000 UTC m=+0.427472678 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101563 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24539687b06908ba0cee0ad8d78d47 SystemUUID:ec245396-87b0-6908-ba0c-ee0ad8d78d47 BootID:63242671-1b44-4ba5-9023-d31052d733fa Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a5:58:cc:45:75 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a5:58:cc:45:75 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fa:70:17:d5:75:86 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 22:04:48.955332 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.955320 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 22:04:48.955471 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.955408 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 22:04:48.955806 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.955784 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 22:04:48.955954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.955808 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-68.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 22:04:48.956000 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.955966 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 22:04:48.956000 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.955975 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 22:04:48.956000 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.955987 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:04:48.956778 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.956768 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:04:48.957682 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.957673 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:04:48.957815 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.957805 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 22:04:48.960585 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.960575 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 22:04:48.960632 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.960589 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 22:04:48.960632 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.960602 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 22:04:48.960632 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.960612 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 22:04:48.960632 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.960621 2576 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 22:04:48.962593 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.962580 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:04:48.962652 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.962603 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:04:48.965828 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.965812 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 22:04:48.967312 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.967298 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 22:04:48.969090 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969076 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 22:04:48.969137 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969100 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 22:04:48.969137 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969110 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 22:04:48.969137 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969119 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 22:04:48.969137 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969127 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 22:04:48.969137 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969137 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 22:04:48.969269 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969146 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
22:04:48.969269 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969155 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 22:04:48.969269 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969165 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 22:04:48.969269 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969175 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 22:04:48.969269 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969189 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 22:04:48.969269 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.969203 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 22:04:48.970066 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.970052 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 22:04:48.970133 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.970068 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 22:04:48.973468 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.973454 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 22:04:48.973548 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.973500 2576 server.go:1295] "Started kubelet" Apr 16 22:04:48.973933 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:48.973911 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-68.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 22:04:48.974033 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:48.974014 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource 
\"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 22:04:48.974146 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.974125 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-68.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 22:04:48.974177 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.974154 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 22:04:48.974209 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.974147 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 22:04:48.974250 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.974228 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 22:04:48.974488 ip-10-0-129-68 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 22:04:48.975947 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.975927 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 22:04:48.977382 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.977366 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 22:04:48.980120 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:48.978797 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-68.ec2.internal.18a6f58cff67f255 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-68.ec2.internal,UID:ip-10-0-129-68.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-68.ec2.internal,},FirstTimestamp:2026-04-16 22:04:48.973468245 +0000 UTC m=+0.446847010,LastTimestamp:2026-04-16 22:04:48.973468245 +0000 UTC m=+0.446847010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-68.ec2.internal,}" Apr 16 22:04:48.980877 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.980857 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 22:04:48.980877 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.980871 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 22:04:48.981896 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.981878 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 22:04:48.982012 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.981880 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 
22:04:48.982105 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.982090 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 22:04:48.982164 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:48.982033 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-68.ec2.internal\" not found" Apr 16 22:04:48.982210 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.982166 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 22:04:48.982210 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.982175 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 22:04:48.982548 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.982530 2576 factory.go:55] Registering systemd factory Apr 16 22:04:48.982613 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.982558 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 22:04:48.982865 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.982847 2576 factory.go:153] Registering CRI-O factory Apr 16 22:04:48.982865 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.982867 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 22:04:48.983010 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:48.982892 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 22:04:48.983010 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.982914 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 22:04:48.983010 ip-10-0-129-68 kubenswrapper[2576]: 
I0416 22:04:48.982941 2576 factory.go:103] Registering Raw factory Apr 16 22:04:48.983010 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.982954 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 22:04:48.983383 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.983365 2576 manager.go:319] Starting recovery of all containers Apr 16 22:04:48.986667 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.986643 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5g777" Apr 16 22:04:48.990299 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:48.990270 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 22:04:48.994196 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.994173 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5g777" Apr 16 22:04:48.996804 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:48.996785 2576 manager.go:324] Recovery completed Apr 16 22:04:49.000906 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.000892 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:04:49.001250 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.001231 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-68.ec2.internal\" not found" node="ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.003095 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.003082 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:04:49.003160 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.003107 2576 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-129-68.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:04:49.003160 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.003116 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:04:49.003612 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.003596 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 22:04:49.003612 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.003609 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 22:04:49.003728 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.003624 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:04:49.005887 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.005875 2576 policy_none.go:49] "None policy: Start" Apr 16 22:04:49.005943 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.005899 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 22:04:49.005943 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.005910 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 22:04:49.062853 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.039830 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 22:04:49.062853 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.039858 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 22:04:49.062853 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.039868 2576 server.go:85] "Starting device plugin registration server" Apr 16 22:04:49.062853 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.040109 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 22:04:49.062853 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.040122 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 22:04:49.062853 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:04:49.040235 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 22:04:49.062853 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.040338 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 22:04:49.062853 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.040346 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 22:04:49.062853 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.041090 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 22:04:49.062853 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.041144 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-68.ec2.internal\" not found" Apr 16 22:04:49.130916 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.130849 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 22:04:49.132188 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.132170 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 22:04:49.132270 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.132209 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 22:04:49.132322 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.132291 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 22:04:49.132322 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.132299 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 22:04:49.132423 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.132396 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 22:04:49.135637 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.135613 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:04:49.140667 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.140652 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:04:49.141839 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.141823 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:04:49.141939 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.141858 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:04:49.141939 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.141872 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:04:49.141939 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.141901 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.151095 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.151081 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.151154 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.151103 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-68.ec2.internal\": node \"ip-10-0-129-68.ec2.internal\" not found" Apr 16 22:04:49.165956 
ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.165931 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-68.ec2.internal\" not found" Apr 16 22:04:49.233426 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.233393 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal"] Apr 16 22:04:49.233536 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.233498 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:04:49.234420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.234391 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:04:49.234509 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.234432 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:04:49.234509 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.234444 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:04:49.236826 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.236814 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:04:49.236956 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.236931 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.236993 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.236970 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:04:49.237522 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.237503 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:04:49.237624 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.237523 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:04:49.237624 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.237546 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:04:49.237624 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.237558 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:04:49.237624 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.237529 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:04:49.237624 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.237617 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:04:49.239685 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.239672 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.239771 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.239709 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:04:49.240445 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.240429 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:04:49.240509 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.240457 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:04:49.240509 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.240470 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:04:49.262294 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.262276 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-68.ec2.internal\" not found" node="ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.266601 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.266586 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-68.ec2.internal\" not found" Apr 16 22:04:49.266813 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.266802 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-68.ec2.internal\" not found" node="ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.283874 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.283850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ca5d577ea59a9b5f03fad9ebac668547-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal\" (UID: \"ca5d577ea59a9b5f03fad9ebac668547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.283968 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.283880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca5d577ea59a9b5f03fad9ebac668547-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal\" (UID: \"ca5d577ea59a9b5f03fad9ebac668547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.283968 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.283898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/67efa12b1e612144155ca84bcb8df9e9-config\") pod \"kube-apiserver-proxy-ip-10-0-129-68.ec2.internal\" (UID: \"67efa12b1e612144155ca84bcb8df9e9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.367396 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.367353 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-68.ec2.internal\" not found" Apr 16 22:04:49.384867 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.384808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca5d577ea59a9b5f03fad9ebac668547-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal\" (UID: \"ca5d577ea59a9b5f03fad9ebac668547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.384867 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.384843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/67efa12b1e612144155ca84bcb8df9e9-config\") pod \"kube-apiserver-proxy-ip-10-0-129-68.ec2.internal\" (UID: \"67efa12b1e612144155ca84bcb8df9e9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.384998 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.384869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ca5d577ea59a9b5f03fad9ebac668547-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal\" (UID: \"ca5d577ea59a9b5f03fad9ebac668547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.384998 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.384898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/67efa12b1e612144155ca84bcb8df9e9-config\") pod \"kube-apiserver-proxy-ip-10-0-129-68.ec2.internal\" (UID: \"67efa12b1e612144155ca84bcb8df9e9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.384998 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.384912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca5d577ea59a9b5f03fad9ebac668547-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal\" (UID: \"ca5d577ea59a9b5f03fad9ebac668547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal" Apr 16 22:04:49.384998 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.384933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ca5d577ea59a9b5f03fad9ebac668547-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal\" (UID: \"ca5d577ea59a9b5f03fad9ebac668547\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal"
Apr 16 22:04:49.468264 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.468227    2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-68.ec2.internal\" not found"
Apr 16 22:04:49.564814 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.564773    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal"
Apr 16 22:04:49.568318 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.568305    2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-68.ec2.internal\" not found"
Apr 16 22:04:49.570471 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.570459    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal"
Apr 16 22:04:49.668741 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.668638    2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-68.ec2.internal\" not found"
Apr 16 22:04:49.769280 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.769232    2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-68.ec2.internal\" not found"
Apr 16 22:04:49.812141 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.812111    2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:49.823399 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.823372    2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:49.881765 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.881740    2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal"
Apr 16 22:04:49.889240 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.889218    2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:04:49.891338 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.891316    2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 22:04:49.891338 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.891339    2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal"
Apr 16 22:04:49.891491 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.891439    2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:04:49.891538 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.891481    2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:04:49.908105 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.908072    2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:04:49.961479 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.961392    2576 apiserver.go:52] "Watching apiserver"
Apr 16 22:04:49.967020 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.966995    2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 22:04:49.970075 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.970055    2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b","openshift-dns/node-resolver-4llp9","openshift-network-operator/iptables-alerter-bdfv7","openshift-cluster-node-tuning-operator/tuned-xlzxx","openshift-image-registry/node-ca-tfhck","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal","openshift-multus/multus-additional-cni-plugins-2gb76","openshift-multus/multus-vlrfv","openshift-multus/network-metrics-daemon-hzjxc","openshift-network-diagnostics/network-check-target-sc5nk","openshift-ovn-kubernetes/ovnkube-node-25llf","kube-system/konnectivity-agent-bdwfp","kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal"]
Apr 16 22:04:49.973222 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.973202    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b"
Apr 16 22:04:49.975223 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.975203    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4llp9"
Apr 16 22:04:49.975492 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.975474    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 22:04:49.975585 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.975557    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bmw9x\""
Apr 16 22:04:49.975630 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.975594    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 22:04:49.975787 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.975770    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 22:04:49.977200 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.977182    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d7w2z\""
Apr 16 22:04:49.977435 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.977279    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 22:04:49.977435 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.977286    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 22:04:49.977435 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.977312    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bdfv7"
Apr 16 22:04:49.979204 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.979187    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:04:49.979298 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.979279    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ckwwq\""
Apr 16 22:04:49.979298 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.979289    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 22:04:49.979471 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.979307    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 22:04:49.979471 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.979356    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.980972 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.980957    2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 22:04:49.981273 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.981259    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:04:49.981376 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.981261    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zt5gz\""
Apr 16 22:04:49.981438 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.981418    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 22:04:49.981582 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.981557    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tfhck"
Apr 16 22:04:49.983599 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.983580    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bx2nq\""
Apr 16 22:04:49.983765 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.983619    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 22:04:49.983765 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.983662    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 22:04:49.983765 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.983727    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 22:04:49.983765 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.983729    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2gb76"
Apr 16 22:04:49.985932 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.985684    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9vld4\""
Apr 16 22:04:49.985932 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.985687    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 22:04:49.985932 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.985803    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 22:04:49.985932 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.985925    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 22:04:49.986168 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.986125    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 22:04:49.986168 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.986157    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vlrfv"
Apr 16 22:04:49.986263 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.986199    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 22:04:49.987892 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.987876    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 22:04:49.988057 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988038    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22mrs\" (UniqueName: \"kubernetes.io/projected/e04f9b26-0017-48cc-a5f0-a9c2bae5d9df-kube-api-access-22mrs\") pod \"node-resolver-4llp9\" (UID: \"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df\") " pod="openshift-dns/node-resolver-4llp9"
Apr 16 22:04:49.988133 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988069    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-run\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.988133 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988097    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-lib-modules\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.988133 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988123    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcp5k\" (UniqueName: \"kubernetes.io/projected/987d2c5b-b0f1-4de0-a04a-f379a59db707-kube-api-access-rcp5k\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b"
Apr 16 22:04:49.988256 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988148    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-system-cni-dir\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988442    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ldz7x\""
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988514    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-modprobe-d\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988565    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-sysconfig\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988640    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-os-release\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988706    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-cni-binary-copy\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988748    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988786    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-registration-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988821    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-sysctl-d\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988853    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-systemd\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988881    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03c81f44-bba2-4d54-b6db-157f9d7e76c7-host\") pod \"node-ca-tfhck\" (UID: \"03c81f44-bba2-4d54-b6db-157f9d7e76c7\") " pod="openshift-image-registry/node-ca-tfhck"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988907    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cqxx\" (UniqueName: \"kubernetes.io/projected/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-kube-api-access-4cqxx\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988937    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-sysctl-conf\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988967    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e04f9b26-0017-48cc-a5f0-a9c2bae5d9df-hosts-file\") pod \"node-resolver-4llp9\" (UID: \"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df\") " pod="openshift-dns/node-resolver-4llp9"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.988996    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-var-lib-kubelet\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989027    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-host\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989054    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76"
Apr 16 22:04:49.989388 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989083    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-etc-selinux\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989113    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-sys-fs\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989142    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e68d4b37-e705-448a-84ba-0da25ef12585-host-slash\") pod \"iptables-alerter-bdfv7\" (UID: \"e68d4b37-e705-448a-84ba-0da25ef12585\") " pod="openshift-network-operator/iptables-alerter-bdfv7"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989171    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-cnibin\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989196    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-socket-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989230    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg5xg\" (UniqueName: \"kubernetes.io/projected/e68d4b37-e705-448a-84ba-0da25ef12585-kube-api-access-mg5xg\") pod \"iptables-alerter-bdfv7\" (UID: \"e68d4b37-e705-448a-84ba-0da25ef12585\") " pod="openshift-network-operator/iptables-alerter-bdfv7"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989257    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-kubernetes\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989286    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/63130b35-d6a9-4017-a7a1-066c8921674f-etc-tuned\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989340    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63130b35-d6a9-4017-a7a1-066c8921674f-tmp\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989442    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03c81f44-bba2-4d54-b6db-157f9d7e76c7-serviceca\") pod \"node-ca-tfhck\" (UID: \"03c81f44-bba2-4d54-b6db-157f9d7e76c7\") " pod="openshift-image-registry/node-ca-tfhck"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989485    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989541    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-sys\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989582    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ts85\" (UniqueName: \"kubernetes.io/projected/63130b35-d6a9-4017-a7a1-066c8921674f-kube-api-access-7ts85\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989622    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxgts\" (UniqueName: \"kubernetes.io/projected/03c81f44-bba2-4d54-b6db-157f9d7e76c7-kube-api-access-nxgts\") pod \"node-ca-tfhck\" (UID: \"03c81f44-bba2-4d54-b6db-157f9d7e76c7\") " pod="openshift-image-registry/node-ca-tfhck"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989679    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-device-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989741    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e04f9b26-0017-48cc-a5f0-a9c2bae5d9df-tmp-dir\") pod \"node-resolver-4llp9\" (UID: \"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df\") " pod="openshift-dns/node-resolver-4llp9"
Apr 16 22:04:49.991030 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989773    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e68d4b37-e705-448a-84ba-0da25ef12585-iptables-alerter-script\") pod \"iptables-alerter-bdfv7\" (UID: \"e68d4b37-e705-448a-84ba-0da25ef12585\") " pod="openshift-network-operator/iptables-alerter-bdfv7"
Apr 16 22:04:49.991654 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.989798    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76"
Apr 16 22:04:49.991654 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.990870    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:04:49.991654 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.990939    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757"
Apr 16 22:04:49.992953 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.992931    2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:04:49.993045 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.993007    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:04:49.993105 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:49.993069    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964"
Apr 16 22:04:49.995331 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.995311    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-25llf"
Apr 16 22:04:49.996253 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.996229    2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 21:59:48 +0000 UTC" deadline="2027-09-17 00:23:31.645849258 +0000 UTC"
Apr 16 22:04:49.996253 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.996251    2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12434h18m41.649599533s"
Apr 16 22:04:49.997352 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.997334    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bdwfp"
Apr 16 22:04:49.997433 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.997407    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kgszc\""
Apr 16 22:04:49.997724 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.997704    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 22:04:49.997866 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.997849    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 22:04:49.997924 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.997912    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 22:04:49.998004 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.997989    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 22:04:49.998230 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.998216    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 22:04:49.998276 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.998222    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 22:04:49.999742 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.999724    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 22:04:49.999927 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.999906    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zjszl\""
Apr 16 22:04:49.999985 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:49.999965    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 22:04:50.010568 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.010545    2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ww8v6"
Apr 16 22:04:50.017416 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.017398    2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ww8v6"
Apr 16 22:04:50.083825 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.083802    2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 22:04:50.089937 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.089910    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5xg\" (UniqueName: \"kubernetes.io/projected/e68d4b37-e705-448a-84ba-0da25ef12585-kube-api-access-mg5xg\") pod \"iptables-alerter-bdfv7\" (UID: \"e68d4b37-e705-448a-84ba-0da25ef12585\") " pod="openshift-network-operator/iptables-alerter-bdfv7"
Apr 16 22:04:50.090070 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.089941    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-kubernetes\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:50.090070 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.089960    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/63130b35-d6a9-4017-a7a1-066c8921674f-etc-tuned\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:50.090070 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.089997    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-etc-kubernetes\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv"
Apr 16 22:04:50.090070 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090015    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-sys\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:50.090070 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090030    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-device-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b"
Apr 16 22:04:50.090070 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090043    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-kubernetes\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:50.090070 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090052    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-cni-dir\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv"
Apr 16 22:04:50.090389 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090099    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-daemon-config\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv"
Apr 16 22:04:50.090389 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090111    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-sys\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx"
Apr 16 22:04:50.090389 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090125    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:04:50.090389 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090160    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fvk6\" (UniqueName: \"kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6\") pod \"network-check-target-sc5nk\" (UID: \"ffbd2631-70f8-45c8-83f0-5e65052e0964\") " pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:04:50.090389 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090186    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/95d30843-3d5e-42ad-94ae-c9a2c65d3e0a-agent-certs\") pod \"konnectivity-agent-bdwfp\" (UID: \"95d30843-3d5e-42ad-94ae-c9a2c65d3e0a\") " pod="kube-system/konnectivity-agent-bdwfp"
Apr 16 22:04:50.090389 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090212    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e68d4b37-e705-448a-84ba-0da25ef12585-iptables-alerter-script\") pod \"iptables-alerter-bdfv7\" (UID: \"e68d4b37-e705-448a-84ba-0da25ef12585\") " pod="openshift-network-operator/iptables-alerter-bdfv7"
Apr 16 22:04:50.090389 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090327    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76"
Apr 16 22:04:50.090389 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090337    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-device-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b"
Apr 16 22:04:50.090389 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090351    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-var-lib-cni-multus\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv"
Apr 16 22:04:50.090389 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090384    2576
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-run-netns\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-run\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090418 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-var-lib-cni-bin\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-run-multus-certs\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090501 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-systemd-units\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-run\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090525 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-var-lib-openvswitch\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-cni-netd\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090593 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-run-k8s-cni-cncf-io\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-hostroot\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-systemd\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-host\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: 
I0416 22:04:50.090788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-host\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.090860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-etc-selinux\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-systemd\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-cnibin\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.091643 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:04:50.090877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-conf-dir\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-etc-selinux\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/95d30843-3d5e-42ad-94ae-c9a2c65d3e0a-konnectivity-ca\") pod \"konnectivity-agent-bdwfp\" (UID: \"95d30843-3d5e-42ad-94ae-c9a2c65d3e0a\") " pod="kube-system/konnectivity-agent-bdwfp" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8tr\" (UniqueName: \"kubernetes.io/projected/d7498930-9a40-4a06-a45f-79c56cdfd2e3-kube-api-access-zr8tr\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-socket-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.090974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-system-cni-dir\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63130b35-d6a9-4017-a7a1-066c8921674f-tmp\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03c81f44-bba2-4d54-b6db-157f9d7e76c7-serviceca\") pod \"node-ca-tfhck\" (UID: \"03c81f44-bba2-4d54-b6db-157f9d7e76c7\") " pod="openshift-image-registry/node-ca-tfhck" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-os-release\") pod \"multus-vlrfv\" (UID: 
\"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-socket-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7498930-9a40-4a06-a45f-79c56cdfd2e3-ovnkube-config\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091154 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ts85\" (UniqueName: \"kubernetes.io/projected/63130b35-d6a9-4017-a7a1-066c8921674f-kube-api-access-7ts85\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.091643 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxgts\" 
(UniqueName: \"kubernetes.io/projected/03c81f44-bba2-4d54-b6db-157f9d7e76c7-kube-api-access-nxgts\") pod \"node-ca-tfhck\" (UID: \"03c81f44-bba2-4d54-b6db-157f9d7e76c7\") " pod="openshift-image-registry/node-ca-tfhck" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e04f9b26-0017-48cc-a5f0-a9c2bae5d9df-tmp-dir\") pod \"node-resolver-4llp9\" (UID: \"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df\") " pod="openshift-dns/node-resolver-4llp9" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-kubelet\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e68d4b37-e705-448a-84ba-0da25ef12585-iptables-alerter-script\") pod \"iptables-alerter-bdfv7\" (UID: \"e68d4b37-e705-448a-84ba-0da25ef12585\") " pod="openshift-network-operator/iptables-alerter-bdfv7" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-run-systemd\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091275 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-etc-openvswitch\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22mrs\" (UniqueName: \"kubernetes.io/projected/e04f9b26-0017-48cc-a5f0-a9c2bae5d9df-kube-api-access-22mrs\") pod \"node-resolver-4llp9\" (UID: \"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df\") " pod="openshift-dns/node-resolver-4llp9" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-lib-modules\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcp5k\" (UniqueName: \"kubernetes.io/projected/987d2c5b-b0f1-4de0-a04a-f379a59db707-kube-api-access-rcp5k\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091385 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6lgr\" (UniqueName: \"kubernetes.io/projected/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-kube-api-access-h6lgr\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.092432 
ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxttt\" (UniqueName: \"kubernetes.io/projected/6690fd79-9fd1-41a1-acf7-d29fd96d4757-kube-api-access-qxttt\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-run-openvswitch\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091460 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-log-socket\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03c81f44-bba2-4d54-b6db-157f9d7e76c7-serviceca\") pod \"node-ca-tfhck\" (UID: 
\"03c81f44-bba2-4d54-b6db-157f9d7e76c7\") " pod="openshift-image-registry/node-ca-tfhck" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d7498930-9a40-4a06-a45f-79c56cdfd2e3-ovnkube-script-lib\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-system-cni-dir\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.092432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-modprobe-d\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-sysconfig\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-lib-modules\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-os-release\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-cni-binary-copy\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-registration-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.093205 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:04:50.091798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e04f9b26-0017-48cc-a5f0-a9c2bae5d9df-tmp-dir\") pod \"node-resolver-4llp9\" (UID: \"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df\") " pod="openshift-dns/node-resolver-4llp9" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-system-cni-dir\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091833 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-slash\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091869 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-sysconfig\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-os-release\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.093205 
ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-sysctl-d\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03c81f44-bba2-4d54-b6db-157f9d7e76c7-host\") pod \"node-ca-tfhck\" (UID: \"03c81f44-bba2-4d54-b6db-157f9d7e76c7\") " pod="openshift-image-registry/node-ca-tfhck" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.091999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-modprobe-d\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cqxx\" (UniqueName: \"kubernetes.io/projected/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-kube-api-access-4cqxx\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" 
Apr 16 22:04:50.093205 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-registration-dir\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-cni-binary-copy\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-run-netns\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-var-lib-kubelet\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-sysctl-d\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-run-ovn\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092187 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7498930-9a40-4a06-a45f-79c56cdfd2e3-ovn-node-metrics-cert\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-sysctl-conf\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03c81f44-bba2-4d54-b6db-157f9d7e76c7-host\") pod \"node-ca-tfhck\" (UID: \"03c81f44-bba2-4d54-b6db-157f9d7e76c7\") " pod="openshift-image-registry/node-ca-tfhck" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e04f9b26-0017-48cc-a5f0-a9c2bae5d9df-hosts-file\") pod \"node-resolver-4llp9\" (UID: 
\"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df\") " pod="openshift-dns/node-resolver-4llp9" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-etc-sysctl-conf\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e04f9b26-0017-48cc-a5f0-a9c2bae5d9df-hosts-file\") pod \"node-resolver-4llp9\" (UID: \"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df\") " pod="openshift-dns/node-resolver-4llp9" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-var-lib-kubelet\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-sys-fs\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-node-log\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.093889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-cni-bin\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.094381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/987d2c5b-b0f1-4de0-a04a-f379a59db707-sys-fs\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.094381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/d7498930-9a40-4a06-a45f-79c56cdfd2e3-env-overrides\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.094381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e68d4b37-e705-448a-84ba-0da25ef12585-host-slash\") pod \"iptables-alerter-bdfv7\" (UID: \"e68d4b37-e705-448a-84ba-0da25ef12585\") " pod="openshift-network-operator/iptables-alerter-bdfv7" Apr 16 22:04:50.094381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-cnibin\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.094381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e68d4b37-e705-448a-84ba-0da25ef12585-host-slash\") pod \"iptables-alerter-bdfv7\" (UID: \"e68d4b37-e705-448a-84ba-0da25ef12585\") " pod="openshift-network-operator/iptables-alerter-bdfv7" Apr 16 22:04:50.094381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-cnibin\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.094381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-socket-dir-parent\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.094381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.092939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63130b35-d6a9-4017-a7a1-066c8921674f-var-lib-kubelet\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.094381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.093971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/63130b35-d6a9-4017-a7a1-066c8921674f-etc-tuned\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.094833 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.094814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63130b35-d6a9-4017-a7a1-066c8921674f-tmp\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.097596 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.097385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg5xg\" (UniqueName: \"kubernetes.io/projected/e68d4b37-e705-448a-84ba-0da25ef12585-kube-api-access-mg5xg\") pod \"iptables-alerter-bdfv7\" (UID: \"e68d4b37-e705-448a-84ba-0da25ef12585\") " pod="openshift-network-operator/iptables-alerter-bdfv7" Apr 16 22:04:50.098757 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.098732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-22mrs\" (UniqueName: \"kubernetes.io/projected/e04f9b26-0017-48cc-a5f0-a9c2bae5d9df-kube-api-access-22mrs\") pod \"node-resolver-4llp9\" (UID: \"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df\") " pod="openshift-dns/node-resolver-4llp9" Apr 16 22:04:50.098850 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.098740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxgts\" (UniqueName: \"kubernetes.io/projected/03c81f44-bba2-4d54-b6db-157f9d7e76c7-kube-api-access-nxgts\") pod \"node-ca-tfhck\" (UID: \"03c81f44-bba2-4d54-b6db-157f9d7e76c7\") " pod="openshift-image-registry/node-ca-tfhck" Apr 16 22:04:50.099156 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.099136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ts85\" (UniqueName: \"kubernetes.io/projected/63130b35-d6a9-4017-a7a1-066c8921674f-kube-api-access-7ts85\") pod \"tuned-xlzxx\" (UID: \"63130b35-d6a9-4017-a7a1-066c8921674f\") " pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.099634 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.099615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cqxx\" (UniqueName: \"kubernetes.io/projected/995fbb71-6c0e-4689-8c49-6fd0c1a79f15-kube-api-access-4cqxx\") pod \"multus-additional-cni-plugins-2gb76\" (UID: \"995fbb71-6c0e-4689-8c49-6fd0c1a79f15\") " pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.099753 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.099736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcp5k\" (UniqueName: \"kubernetes.io/projected/987d2c5b-b0f1-4de0-a04a-f379a59db707-kube-api-access-rcp5k\") pod \"aws-ebs-csi-driver-node-xfb8b\" (UID: \"987d2c5b-b0f1-4de0-a04a-f379a59db707\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.190442 ip-10-0-129-68 kubenswrapper[2576]: W0416 
22:04:50.190403 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67efa12b1e612144155ca84bcb8df9e9.slice/crio-a65a7918dec23edc2851ffac0ae02b3e006ed18397f8f44c42628b0fc35c4cfd WatchSource:0}: Error finding container a65a7918dec23edc2851ffac0ae02b3e006ed18397f8f44c42628b0fc35c4cfd: Status 404 returned error can't find the container with id a65a7918dec23edc2851ffac0ae02b3e006ed18397f8f44c42628b0fc35c4cfd Apr 16 22:04:50.192543 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:50.192516 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca5d577ea59a9b5f03fad9ebac668547.slice/crio-d04a9ed96a139635195b63e3583fc3b9e35a8730ef83f4d13919327132800ea9 WatchSource:0}: Error finding container d04a9ed96a139635195b63e3583fc3b9e35a8730ef83f4d13919327132800ea9: Status 404 returned error can't find the container with id d04a9ed96a139635195b63e3583fc3b9e35a8730ef83f4d13919327132800ea9 Apr 16 22:04:50.193726 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.193805 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193759 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.193805 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193765 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-cnibin\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.193870 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-conf-dir\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.193870 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/95d30843-3d5e-42ad-94ae-c9a2c65d3e0a-konnectivity-ca\") pod \"konnectivity-agent-bdwfp\" (UID: \"95d30843-3d5e-42ad-94ae-c9a2c65d3e0a\") " pod="kube-system/konnectivity-agent-bdwfp" Apr 16 22:04:50.193870 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-conf-dir\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.193870 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-cnibin\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.193870 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8tr\" (UniqueName: 
\"kubernetes.io/projected/d7498930-9a40-4a06-a45f-79c56cdfd2e3-kube-api-access-zr8tr\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194035 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-system-cni-dir\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.194035 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-os-release\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.194035 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194035 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.193974 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7498930-9a40-4a06-a45f-79c56cdfd2e3-ovnkube-config\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194035 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-kubelet\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194035 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-system-cni-dir\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.194035 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-run-systemd\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-etc-openvswitch\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-kubelet\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6lgr\" (UniqueName: \"kubernetes.io/projected/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-kube-api-access-h6lgr\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-run-systemd\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-etc-openvswitch\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxttt\" (UniqueName: \"kubernetes.io/projected/6690fd79-9fd1-41a1-acf7-d29fd96d4757-kube-api-access-qxttt\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-os-release\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-run-openvswitch\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-log-socket\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d7498930-9a40-4a06-a45f-79c56cdfd2e3-ovnkube-script-lib\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194189 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-run-openvswitch\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-slash\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-cni-binary-copy\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.194356 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/95d30843-3d5e-42ad-94ae-c9a2c65d3e0a-konnectivity-ca\") pod \"konnectivity-agent-bdwfp\" (UID: \"95d30843-3d5e-42ad-94ae-c9a2c65d3e0a\") " pod="kube-system/konnectivity-agent-bdwfp" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-log-socket\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-slash\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-run-netns\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-run-netns\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7498930-9a40-4a06-a45f-79c56cdfd2e3-ovnkube-config\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-var-lib-kubelet\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-run-ovn\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/d7498930-9a40-4a06-a45f-79c56cdfd2e3-ovn-node-metrics-cert\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-var-lib-kubelet\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-node-log\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-cni-bin\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-run-ovn\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d7498930-9a40-4a06-a45f-79c56cdfd2e3-env-overrides\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-node-log\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-socket-dir-parent\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d7498930-9a40-4a06-a45f-79c56cdfd2e3-ovnkube-script-lib\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-etc-kubernetes\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-cni-bin\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-cni-dir\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-etc-kubernetes\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-socket-dir-parent\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-daemon-config\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-cni-dir\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fvk6\" (UniqueName: \"kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6\") pod \"network-check-target-sc5nk\" (UID: \"ffbd2631-70f8-45c8-83f0-5e65052e0964\") " pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-cni-binary-copy\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/95d30843-3d5e-42ad-94ae-c9a2c65d3e0a-agent-certs\") pod \"konnectivity-agent-bdwfp\" (UID: \"95d30843-3d5e-42ad-94ae-c9a2c65d3e0a\") " pod="kube-system/konnectivity-agent-bdwfp" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.194947 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.194963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-var-lib-cni-multus\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.195081 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs podName:6690fd79-9fd1-41a1-acf7-d29fd96d4757 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:50.694998952 +0000 UTC m=+2.168377703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs") pod "network-metrics-daemon-hzjxc" (UID: "6690fd79-9fd1-41a1-acf7-d29fd96d4757") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-run-netns\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7498930-9a40-4a06-a45f-79c56cdfd2e3-env-overrides\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: 
I0416 22:04:50.195204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-var-lib-cni-bin\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-var-lib-cni-bin\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.195841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195259 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-multus-daemon-config\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-run-multus-certs\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-run-netns\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195309 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-systemd-units\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195320 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-var-lib-cni-multus\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-run-multus-certs\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-var-lib-openvswitch\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-systemd-units\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195368 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-cni-netd\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-var-lib-openvswitch\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-run-k8s-cni-cncf-io\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195457 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-hostroot\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-host-run-k8s-cni-cncf-io\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195510 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-hostroot\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.196420 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.195538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d7498930-9a40-4a06-a45f-79c56cdfd2e3-host-cni-netd\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.197145 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.197129 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:04:50.197343 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.197323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7498930-9a40-4a06-a45f-79c56cdfd2e3-ovn-node-metrics-cert\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.197449 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.197432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/95d30843-3d5e-42ad-94ae-c9a2c65d3e0a-agent-certs\") pod \"konnectivity-agent-bdwfp\" (UID: \"95d30843-3d5e-42ad-94ae-c9a2c65d3e0a\") " pod="kube-system/konnectivity-agent-bdwfp" Apr 16 22:04:50.202120 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.202104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8tr\" (UniqueName: \"kubernetes.io/projected/d7498930-9a40-4a06-a45f-79c56cdfd2e3-kube-api-access-zr8tr\") pod \"ovnkube-node-25llf\" (UID: \"d7498930-9a40-4a06-a45f-79c56cdfd2e3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.203499 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.203478 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:04:50.203499 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.203502 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:04:50.203661 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.203516 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fvk6 for pod openshift-network-diagnostics/network-check-target-sc5nk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:50.203661 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.203611 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6 podName:ffbd2631-70f8-45c8-83f0-5e65052e0964 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:50.703592389 +0000 UTC m=+2.176971151 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7fvk6" (UniqueName: "kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6") pod "network-check-target-sc5nk" (UID: "ffbd2631-70f8-45c8-83f0-5e65052e0964") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:50.205229 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.205206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6lgr\" (UniqueName: \"kubernetes.io/projected/482f4fcf-1af7-4c0a-a8d2-c059af41fba7-kube-api-access-h6lgr\") pod \"multus-vlrfv\" (UID: \"482f4fcf-1af7-4c0a-a8d2-c059af41fba7\") " pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.205476 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.205461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxttt\" (UniqueName: \"kubernetes.io/projected/6690fd79-9fd1-41a1-acf7-d29fd96d4757-kube-api-access-qxttt\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:50.302472 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.302369 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" Apr 16 22:04:50.308363 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:50.308333 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987d2c5b_b0f1_4de0_a04a_f379a59db707.slice/crio-8fe73adfc3476bd25897fc74f0dd4c4ec2c5d4d4a55c1a498bf8ab5af2efc7e4 WatchSource:0}: Error finding container 8fe73adfc3476bd25897fc74f0dd4c4ec2c5d4d4a55c1a498bf8ab5af2efc7e4: Status 404 returned error can't find the container with id 8fe73adfc3476bd25897fc74f0dd4c4ec2c5d4d4a55c1a498bf8ab5af2efc7e4 Apr 16 22:04:50.322585 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.322562 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4llp9" Apr 16 22:04:50.329078 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:50.329052 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode04f9b26_0017_48cc_a5f0_a9c2bae5d9df.slice/crio-3295e5b038e2222b1908fe5e880d9854b0320c0703138f6df70cd5802d919639 WatchSource:0}: Error finding container 3295e5b038e2222b1908fe5e880d9854b0320c0703138f6df70cd5802d919639: Status 404 returned error can't find the container with id 3295e5b038e2222b1908fe5e880d9854b0320c0703138f6df70cd5802d919639 Apr 16 22:04:50.345449 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.345424 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-bdfv7" Apr 16 22:04:50.351186 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:50.351161 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode68d4b37_e705_448a_84ba_0da25ef12585.slice/crio-76aa87a2701cc4c1a5904e4556119074f7efa7c2fce90e917db2ed330e276917 WatchSource:0}: Error finding container 76aa87a2701cc4c1a5904e4556119074f7efa7c2fce90e917db2ed330e276917: Status 404 returned error can't find the container with id 76aa87a2701cc4c1a5904e4556119074f7efa7c2fce90e917db2ed330e276917 Apr 16 22:04:50.361802 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.361782 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" Apr 16 22:04:50.367681 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:50.367657 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63130b35_d6a9_4017_a7a1_066c8921674f.slice/crio-8e4d31c6c2375fe445880d2618e7c034c0014bb07f203ff2ab49b00a040eb612 WatchSource:0}: Error finding container 8e4d31c6c2375fe445880d2618e7c034c0014bb07f203ff2ab49b00a040eb612: Status 404 returned error can't find the container with id 8e4d31c6c2375fe445880d2618e7c034c0014bb07f203ff2ab49b00a040eb612 Apr 16 22:04:50.376058 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.376041 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tfhck" Apr 16 22:04:50.378870 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.378850 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:04:50.382053 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:50.382031 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03c81f44_bba2_4d54_b6db_157f9d7e76c7.slice/crio-8eca12ebd1c407b0a9e30b369aed4eba505b42378b33da70a5a788ec002c75ce WatchSource:0}: Error finding container 8eca12ebd1c407b0a9e30b369aed4eba505b42378b33da70a5a788ec002c75ce: Status 404 returned error can't find the container with id 8eca12ebd1c407b0a9e30b369aed4eba505b42378b33da70a5a788ec002c75ce Apr 16 22:04:50.392948 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.392929 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2gb76" Apr 16 22:04:50.419060 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.419038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vlrfv" Apr 16 22:04:50.424552 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:50.424530 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482f4fcf_1af7_4c0a_a8d2_c059af41fba7.slice/crio-fa79566c274bce32f18ecb28d77072b5fca09c7b1e69af86a4067cc3854947cc WatchSource:0}: Error finding container fa79566c274bce32f18ecb28d77072b5fca09c7b1e69af86a4067cc3854947cc: Status 404 returned error can't find the container with id fa79566c274bce32f18ecb28d77072b5fca09c7b1e69af86a4067cc3854947cc Apr 16 22:04:50.433751 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.433730 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" Apr 16 22:04:50.437292 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.437273 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bdwfp" Apr 16 22:04:50.439274 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:50.439252 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7498930_9a40_4a06_a45f_79c56cdfd2e3.slice/crio-04029e7b8514745d3bf03288bdce136c935f89af61628bddc190733eb2a6df4d WatchSource:0}: Error finding container 04029e7b8514745d3bf03288bdce136c935f89af61628bddc190733eb2a6df4d: Status 404 returned error can't find the container with id 04029e7b8514745d3bf03288bdce136c935f89af61628bddc190733eb2a6df4d Apr 16 22:04:50.443795 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:04:50.443771 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95d30843_3d5e_42ad_94ae_c9a2c65d3e0a.slice/crio-ba74ff199f0fb172b84e01da16c042bc88d14a4053737cab3d84f043bcdf18ef WatchSource:0}: Error finding container ba74ff199f0fb172b84e01da16c042bc88d14a4053737cab3d84f043bcdf18ef: Status 404 returned error can't find the container with id ba74ff199f0fb172b84e01da16c042bc88d14a4053737cab3d84f043bcdf18ef Apr 16 22:04:50.700340 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.700262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:50.700541 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.700424 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:50.700541 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.700501 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs podName:6690fd79-9fd1-41a1-acf7-d29fd96d4757 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:51.700480097 +0000 UTC m=+3.173858862 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs") pod "network-metrics-daemon-hzjxc" (UID: "6690fd79-9fd1-41a1-acf7-d29fd96d4757") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:50.801991 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:50.801956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fvk6\" (UniqueName: \"kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6\") pod \"network-check-target-sc5nk\" (UID: \"ffbd2631-70f8-45c8-83f0-5e65052e0964\") " pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:04:50.802158 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.802115 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:04:50.802158 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.802134 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:04:50.802158 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.802146 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fvk6 for pod openshift-network-diagnostics/network-check-target-sc5nk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:50.802307 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:50.802205 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6 podName:ffbd2631-70f8-45c8-83f0-5e65052e0964 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:51.802185704 +0000 UTC m=+3.275564457 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fvk6" (UniqueName: "kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6") pod "network-check-target-sc5nk" (UID: "ffbd2631-70f8-45c8-83f0-5e65052e0964") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:51.007487 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.007235 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:04:51.022155 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.018539 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 21:59:50 +0000 UTC" deadline="2027-09-14 21:15:45.873571317 +0000 UTC" Apr 16 22:04:51.022155 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.018572 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12383h10m54.855003227s" Apr 16 22:04:51.164748 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.161511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" event={"ID":"d7498930-9a40-4a06-a45f-79c56cdfd2e3","Type":"ContainerStarted","Data":"04029e7b8514745d3bf03288bdce136c935f89af61628bddc190733eb2a6df4d"} Apr 16 22:04:51.164748 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:04:51.162925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vlrfv" event={"ID":"482f4fcf-1af7-4c0a-a8d2-c059af41fba7","Type":"ContainerStarted","Data":"fa79566c274bce32f18ecb28d77072b5fca09c7b1e69af86a4067cc3854947cc"} Apr 16 22:04:51.165271 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.165245 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4llp9" event={"ID":"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df","Type":"ContainerStarted","Data":"3295e5b038e2222b1908fe5e880d9854b0320c0703138f6df70cd5802d919639"} Apr 16 22:04:51.176751 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.175259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal" event={"ID":"67efa12b1e612144155ca84bcb8df9e9","Type":"ContainerStarted","Data":"a65a7918dec23edc2851ffac0ae02b3e006ed18397f8f44c42628b0fc35c4cfd"} Apr 16 22:04:51.182670 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.182639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gb76" event={"ID":"995fbb71-6c0e-4689-8c49-6fd0c1a79f15","Type":"ContainerStarted","Data":"f24aaa9483302bdc24c40e2c0999868d582420d0c9e9a66106fe8050b5a794ed"} Apr 16 22:04:51.197482 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.197440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tfhck" event={"ID":"03c81f44-bba2-4d54-b6db-157f9d7e76c7","Type":"ContainerStarted","Data":"8eca12ebd1c407b0a9e30b369aed4eba505b42378b33da70a5a788ec002c75ce"} Apr 16 22:04:51.215990 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.215851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" event={"ID":"63130b35-d6a9-4017-a7a1-066c8921674f","Type":"ContainerStarted","Data":"8e4d31c6c2375fe445880d2618e7c034c0014bb07f203ff2ab49b00a040eb612"} Apr 16 
22:04:51.234708 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.234519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bdfv7" event={"ID":"e68d4b37-e705-448a-84ba-0da25ef12585","Type":"ContainerStarted","Data":"76aa87a2701cc4c1a5904e4556119074f7efa7c2fce90e917db2ed330e276917"} Apr 16 22:04:51.259767 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.258959 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" event={"ID":"987d2c5b-b0f1-4de0-a04a-f379a59db707","Type":"ContainerStarted","Data":"8fe73adfc3476bd25897fc74f0dd4c4ec2c5d4d4a55c1a498bf8ab5af2efc7e4"} Apr 16 22:04:51.267321 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.267281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal" event={"ID":"ca5d577ea59a9b5f03fad9ebac668547","Type":"ContainerStarted","Data":"d04a9ed96a139635195b63e3583fc3b9e35a8730ef83f4d13919327132800ea9"} Apr 16 22:04:51.280053 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.279966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bdwfp" event={"ID":"95d30843-3d5e-42ad-94ae-c9a2c65d3e0a","Type":"ContainerStarted","Data":"ba74ff199f0fb172b84e01da16c042bc88d14a4053737cab3d84f043bcdf18ef"} Apr 16 22:04:51.710465 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.710355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:51.710635 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:51.710500 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:51.710635 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:51.710563 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs podName:6690fd79-9fd1-41a1-acf7-d29fd96d4757 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:53.710543967 +0000 UTC m=+5.183922722 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs") pod "network-metrics-daemon-hzjxc" (UID: "6690fd79-9fd1-41a1-acf7-d29fd96d4757") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:51.810888 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:51.810851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fvk6\" (UniqueName: \"kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6\") pod \"network-check-target-sc5nk\" (UID: \"ffbd2631-70f8-45c8-83f0-5e65052e0964\") " pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:04:51.811143 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:51.811018 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:04:51.811143 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:51.811042 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:04:51.811143 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:51.811054 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fvk6 for pod openshift-network-diagnostics/network-check-target-sc5nk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:51.811143 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:51.811111 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6 podName:ffbd2631-70f8-45c8-83f0-5e65052e0964 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:53.811093097 +0000 UTC m=+5.284471868 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fvk6" (UniqueName: "kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6") pod "network-check-target-sc5nk" (UID: "ffbd2631-70f8-45c8-83f0-5e65052e0964") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:52.019305 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:52.019204 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 21:59:50 +0000 UTC" deadline="2027-10-13 06:05:02.517582646 +0000 UTC" Apr 16 22:04:52.019305 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:52.019247 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13064h0m10.498340238s" Apr 16 22:04:52.133244 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:52.133211 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:04:52.133435 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:52.133344 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964" Apr 16 22:04:52.133648 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:52.133626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:52.133793 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:52.133771 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757" Apr 16 22:04:52.564430 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:52.564398 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:04:53.726503 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:53.726467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:53.726970 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:53.726622 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:53.726970 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:53.726688 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs podName:6690fd79-9fd1-41a1-acf7-d29fd96d4757 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:04:57.726669794 +0000 UTC m=+9.200048546 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs") pod "network-metrics-daemon-hzjxc" (UID: "6690fd79-9fd1-41a1-acf7-d29fd96d4757") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:53.827213 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:53.827113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fvk6\" (UniqueName: \"kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6\") pod \"network-check-target-sc5nk\" (UID: \"ffbd2631-70f8-45c8-83f0-5e65052e0964\") " pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:04:53.827395 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:53.827259 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:04:53.827395 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:53.827279 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:04:53.827395 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:53.827291 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fvk6 for pod openshift-network-diagnostics/network-check-target-sc5nk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:53.827395 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:53.827351 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6 
podName:ffbd2631-70f8-45c8-83f0-5e65052e0964 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:57.82733282 +0000 UTC m=+9.300711575 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fvk6" (UniqueName: "kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6") pod "network-check-target-sc5nk" (UID: "ffbd2631-70f8-45c8-83f0-5e65052e0964") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:54.133951 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:54.133481 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:04:54.133951 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:54.133618 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964" Apr 16 22:04:54.133951 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:54.133653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:54.133951 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:54.133784 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757" Apr 16 22:04:55.177896 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.177783 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-c2gt5"] Apr 16 22:04:55.180994 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.180966 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:55.181237 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:55.181146 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef" Apr 16 22:04:55.239817 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.239561 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d899808a-e158-4915-b6c2-f135d5b829ef-kubelet-config\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:55.239817 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.239621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d899808a-e158-4915-b6c2-f135d5b829ef-dbus\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:55.239817 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.239758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:55.340267 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.340224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:55.340458 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.340279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d899808a-e158-4915-b6c2-f135d5b829ef-kubelet-config\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:55.340458 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.340403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d899808a-e158-4915-b6c2-f135d5b829ef-dbus\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:55.340458 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:55.340419 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:04:55.340601 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:55.340497 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret podName:d899808a-e158-4915-b6c2-f135d5b829ef nodeName:}" failed. 
No retries permitted until 2026-04-16 22:04:55.840476175 +0000 UTC m=+7.313854933 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret") pod "global-pull-secret-syncer-c2gt5" (UID: "d899808a-e158-4915-b6c2-f135d5b829ef") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:04:55.340601 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.340525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d899808a-e158-4915-b6c2-f135d5b829ef-dbus\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:55.340601 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.340586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d899808a-e158-4915-b6c2-f135d5b829ef-kubelet-config\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:55.845037 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:55.844958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:55.845234 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:55.845091 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:04:55.845234 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:55.845168 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret podName:d899808a-e158-4915-b6c2-f135d5b829ef nodeName:}" failed. No retries permitted until 2026-04-16 22:04:56.84514792 +0000 UTC m=+8.318526674 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret") pod "global-pull-secret-syncer-c2gt5" (UID: "d899808a-e158-4915-b6c2-f135d5b829ef") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:04:56.132870 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:56.132792 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:04:56.133057 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:56.132923 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964" Apr 16 22:04:56.133057 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:56.132974 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:56.133057 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:56.133062 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757" Apr 16 22:04:56.852296 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:56.852258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:56.852724 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:56.852429 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:04:56.852724 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:56.852487 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret podName:d899808a-e158-4915-b6c2-f135d5b829ef nodeName:}" failed. No retries permitted until 2026-04-16 22:04:58.852469526 +0000 UTC m=+10.325848276 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret") pod "global-pull-secret-syncer-c2gt5" (UID: "d899808a-e158-4915-b6c2-f135d5b829ef") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:04:57.132968 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:57.132862 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:57.133128 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:57.132986 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef" Apr 16 22:04:57.759708 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:57.759652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:57.759896 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:57.759806 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:57.759896 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:57.759876 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs podName:6690fd79-9fd1-41a1-acf7-d29fd96d4757 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:05.759857358 +0000 UTC m=+17.233236111 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs") pod "network-metrics-daemon-hzjxc" (UID: "6690fd79-9fd1-41a1-acf7-d29fd96d4757") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:57.860646 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:57.860562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fvk6\" (UniqueName: \"kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6\") pod \"network-check-target-sc5nk\" (UID: \"ffbd2631-70f8-45c8-83f0-5e65052e0964\") " pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:04:57.861109 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:57.860756 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:04:57.861109 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:57.860787 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:04:57.861109 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:57.860800 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fvk6 for pod openshift-network-diagnostics/network-check-target-sc5nk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:57.861109 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:57.860868 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6 podName:ffbd2631-70f8-45c8-83f0-5e65052e0964 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:05:05.860846962 +0000 UTC m=+17.334225724 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fvk6" (UniqueName: "kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6") pod "network-check-target-sc5nk" (UID: "ffbd2631-70f8-45c8-83f0-5e65052e0964") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:58.133135 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:58.133050 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:04:58.133285 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:58.133189 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964" Apr 16 22:04:58.133285 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:58.133198 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:04:58.133400 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:58.133296 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757" Apr 16 22:04:58.871140 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:58.870505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:58.871140 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:58.870677 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:04:58.871140 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:58.870764 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret podName:d899808a-e158-4915-b6c2-f135d5b829ef nodeName:}" failed. No retries permitted until 2026-04-16 22:05:02.870745882 +0000 UTC m=+14.344124634 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret") pod "global-pull-secret-syncer-c2gt5" (UID: "d899808a-e158-4915-b6c2-f135d5b829ef") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:04:59.134106 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:04:59.133982 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:04:59.134266 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:04:59.134106 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef" Apr 16 22:05:00.133498 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:00.133461 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:05:00.133951 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:00.133461 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:05:00.133951 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:00.133612 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757" Apr 16 22:05:00.133951 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:00.133666 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964" Apr 16 22:05:01.133425 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:01.133390 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:05:01.133606 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:01.133519 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef" Apr 16 22:05:02.133215 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:02.133169 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:05:02.133215 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:02.133187 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:05:02.133447 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:02.133288 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964" Apr 16 22:05:02.133447 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:02.133387 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757" Apr 16 22:05:02.901996 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:02.901957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:05:02.902384 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:02.902121 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:05:02.902384 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:02.902192 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret podName:d899808a-e158-4915-b6c2-f135d5b829ef nodeName:}" failed. No retries permitted until 2026-04-16 22:05:10.902177143 +0000 UTC m=+22.375555907 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret") pod "global-pull-secret-syncer-c2gt5" (UID: "d899808a-e158-4915-b6c2-f135d5b829ef") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:05:03.133337 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:03.133303 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:05:03.133520 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:03.133441 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef" Apr 16 22:05:04.132898 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:04.132807 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:05:04.133387 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:04.132808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:05:04.133387 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:04.132950 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964" Apr 16 22:05:04.133387 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:04.133011 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757" Apr 16 22:05:05.136208 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:05.136177 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5"
Apr 16 22:05:05.136617 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:05.136308 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef"
Apr 16 22:05:05.822578 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:05.822540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:05:05.822878 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:05.822667 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:05.822878 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:05.822764 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs podName:6690fd79-9fd1-41a1-acf7-d29fd96d4757 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:21.822743011 +0000 UTC m=+33.296121774 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs") pod "network-metrics-daemon-hzjxc" (UID: "6690fd79-9fd1-41a1-acf7-d29fd96d4757") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:05.923077 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:05.923037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fvk6\" (UniqueName: \"kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6\") pod \"network-check-target-sc5nk\" (UID: \"ffbd2631-70f8-45c8-83f0-5e65052e0964\") " pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:05.923262 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:05.923178 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:05:05.923262 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:05.923199 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:05:05.923262 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:05.923212 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fvk6 for pod openshift-network-diagnostics/network-check-target-sc5nk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:05.923413 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:05.923274 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6 podName:ffbd2631-70f8-45c8-83f0-5e65052e0964 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:21.923253889 +0000 UTC m=+33.396632636 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fvk6" (UniqueName: "kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6") pod "network-check-target-sc5nk" (UID: "ffbd2631-70f8-45c8-83f0-5e65052e0964") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:06.133129 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:06.133046 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:05:06.133290 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:06.133047 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:06.133290 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:06.133197 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757"
Apr 16 22:05:06.133290 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:06.133251 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964"
Apr 16 22:05:07.132632 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:07.132601 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5"
Apr 16 22:05:07.133065 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:07.132732 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef"
Apr 16 22:05:08.133130 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:08.133105 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:05:08.133488 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:08.133106 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:08.133488 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:08.133205 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757"
Apr 16 22:05:08.133488 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:08.133271 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964"
Apr 16 22:05:09.133547 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.133396 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5"
Apr 16 22:05:09.133972 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:09.133612 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef"
Apr 16 22:05:09.321446 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.321246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" event={"ID":"63130b35-d6a9-4017-a7a1-066c8921674f","Type":"ContainerStarted","Data":"b8e482a672743b8fe89c2c103d0998c721ddcaa73032105adf71021bf3f827d6"}
Apr 16 22:05:09.323834 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.323771 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log"
Apr 16 22:05:09.324207 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.324183 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7498930-9a40-4a06-a45f-79c56cdfd2e3" containerID="255ad5cec845843c65e1120cd5aac7cca9f9e81232b6ea779790ca412e0ead50" exitCode=1
Apr 16 22:05:09.324370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.324212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" event={"ID":"d7498930-9a40-4a06-a45f-79c56cdfd2e3","Type":"ContainerStarted","Data":"0c756e8c7bb92a049d3119a741ff87e84961d8f8b6aa493889a07cf4f63be801"}
Apr 16 22:05:09.324370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.324243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" event={"ID":"d7498930-9a40-4a06-a45f-79c56cdfd2e3","Type":"ContainerStarted","Data":"779bfb4b2190e9df43b0adc9243d4570b3d344ab81b6f217f65181bcb020afa8"}
Apr 16 22:05:09.324370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.324258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" event={"ID":"d7498930-9a40-4a06-a45f-79c56cdfd2e3","Type":"ContainerStarted","Data":"cff7426490922a037d7832d674742c929190a2f78f701951fb4d8bbc15dff957"}
Apr 16 22:05:09.324370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.324271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" event={"ID":"d7498930-9a40-4a06-a45f-79c56cdfd2e3","Type":"ContainerStarted","Data":"c6f62126f4b912eca888bc7cd6fbfc7bbe09c45befbcc2c676025512b0ccfdc9"}
Apr 16 22:05:09.324370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.324288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" event={"ID":"d7498930-9a40-4a06-a45f-79c56cdfd2e3","Type":"ContainerDied","Data":"255ad5cec845843c65e1120cd5aac7cca9f9e81232b6ea779790ca412e0ead50"}
Apr 16 22:05:09.324370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.324303 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" event={"ID":"d7498930-9a40-4a06-a45f-79c56cdfd2e3","Type":"ContainerStarted","Data":"cfe5acb978e33bf87a5e3b29b4038fa67780dfa8537e6ec4a637481f478237e1"}
Apr 16 22:05:09.325506 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.325479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vlrfv" event={"ID":"482f4fcf-1af7-4c0a-a8d2-c059af41fba7","Type":"ContainerStarted","Data":"3089c300b77de471cbc67c5bc260286ed561e8010894e7f4efd72b98a6017b43"}
Apr 16 22:05:09.326586 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.326569 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal" event={"ID":"67efa12b1e612144155ca84bcb8df9e9","Type":"ContainerStarted","Data":"1aad7ddfc07ce158e1107598b9829000ee5d598301b7080722ff0edb1ef5a784"}
Apr 16 22:05:09.334218 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.334181 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xlzxx" podStartSLOduration=2.534187071 podStartE2EDuration="20.334168547s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:04:50.369232134 +0000 UTC m=+1.842610882" lastFinishedPulling="2026-04-16 22:05:08.169213607 +0000 UTC m=+19.642592358" observedRunningTime="2026-04-16 22:05:09.333907315 +0000 UTC m=+20.807286085" watchObservedRunningTime="2026-04-16 22:05:09.334168547 +0000 UTC m=+20.807547319"
Apr 16 22:05:09.344983 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.344900 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-68.ec2.internal" podStartSLOduration=20.344889988 podStartE2EDuration="20.344889988s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:05:09.344723935 +0000 UTC m=+20.818102702" watchObservedRunningTime="2026-04-16 22:05:09.344889988 +0000 UTC m=+20.818268830"
Apr 16 22:05:09.359096 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:09.359063 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vlrfv" podStartSLOduration=2.596772045 podStartE2EDuration="20.359052533s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:04:50.426009896 +0000 UTC m=+1.899388645" lastFinishedPulling="2026-04-16 22:05:08.188290382 +0000 UTC m=+19.661669133" observedRunningTime="2026-04-16 22:05:09.358711209 +0000 UTC m=+20.832089972" watchObservedRunningTime="2026-04-16 22:05:09.359052533 +0000 UTC m=+20.832431302"
Apr 16 22:05:10.133424 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:10.133393 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:10.133633 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:10.133403 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:05:10.133633 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:10.133499 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964"
Apr 16 22:05:10.133633 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:10.133570 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757"
Apr 16 22:05:10.329712 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:10.329656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bdfv7" event={"ID":"e68d4b37-e705-448a-84ba-0da25ef12585","Type":"ContainerStarted","Data":"b650f3332f71e5fe2ab2f4c3244e708531793a3a9a49898a11a9efedcd9e21de"}
Apr 16 22:05:10.963231 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:10.963197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5"
Apr 16 22:05:10.963407 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:10.963338 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:05:10.963407 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:10.963400 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret podName:d899808a-e158-4915-b6c2-f135d5b829ef nodeName:}" failed. No retries permitted until 2026-04-16 22:05:26.963385066 +0000 UTC m=+38.436763820 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret") pod "global-pull-secret-syncer-c2gt5" (UID: "d899808a-e158-4915-b6c2-f135d5b829ef") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:05:11.133293 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:11.133257 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5"
Apr 16 22:05:11.133467 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:11.133384 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef"
Apr 16 22:05:11.335718 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:11.335681 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log"
Apr 16 22:05:11.336154 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:11.336126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" event={"ID":"d7498930-9a40-4a06-a45f-79c56cdfd2e3","Type":"ContainerStarted","Data":"0d93b4842e9a910b3efecf03c1caf17744e31eca0aacef43e2f2ed78ff2bef4f"}
Apr 16 22:05:12.132815 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.132790 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:12.132942 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.132793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:05:12.132942 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:12.132896 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964"
Apr 16 22:05:12.133014 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:12.132962 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757"
Apr 16 22:05:12.338760 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.338724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4llp9" event={"ID":"e04f9b26-0017-48cc-a5f0-a9c2bae5d9df","Type":"ContainerStarted","Data":"ed7b2108130aaa12bf3e33177e83f74dc341eb03ba8ab88b81a703bd91f6549e"}
Apr 16 22:05:12.340051 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.340026 2576 generic.go:358] "Generic (PLEG): container finished" podID="995fbb71-6c0e-4689-8c49-6fd0c1a79f15" containerID="518b971a44e3f18cac5b0636c1fea4bd2777f902f8ce9b7748b4314899829249" exitCode=0
Apr 16 22:05:12.340174 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.340089 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gb76" event={"ID":"995fbb71-6c0e-4689-8c49-6fd0c1a79f15","Type":"ContainerDied","Data":"518b971a44e3f18cac5b0636c1fea4bd2777f902f8ce9b7748b4314899829249"}
Apr 16 22:05:12.341316 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.341172 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tfhck" event={"ID":"03c81f44-bba2-4d54-b6db-157f9d7e76c7","Type":"ContainerStarted","Data":"b03ccaf7290d6803a00ecfba9c8dc50ca080ddadee91b24eda9090ded09e4285"}
Apr 16 22:05:12.342493 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.342472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" event={"ID":"987d2c5b-b0f1-4de0-a04a-f379a59db707","Type":"ContainerStarted","Data":"783306e43a0f93a3f9dc0c52f6604799bcfcc0d7f9b19ed4d4c481baad1d0051"}
Apr 16 22:05:12.343917 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.343771 2576 generic.go:358] "Generic (PLEG): container finished" podID="ca5d577ea59a9b5f03fad9ebac668547" containerID="139f53d63b59afde0e7068c2238181abb48973505b14c1225f3c892e7552f047" exitCode=0
Apr 16 22:05:12.343992 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.343802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal" event={"ID":"ca5d577ea59a9b5f03fad9ebac668547","Type":"ContainerDied","Data":"139f53d63b59afde0e7068c2238181abb48973505b14c1225f3c892e7552f047"}
Apr 16 22:05:12.345157 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.345138 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bdwfp" event={"ID":"95d30843-3d5e-42ad-94ae-c9a2c65d3e0a","Type":"ContainerStarted","Data":"c47d0b14a25836573ed85245b78c27b84d1d38d2762ce4f9e179ca07c014a779"}
Apr 16 22:05:12.361135 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.361100 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-bdfv7" podStartSLOduration=5.573494 podStartE2EDuration="23.361090643s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:04:50.352606325 +0000 UTC m=+1.825985073" lastFinishedPulling="2026-04-16 22:05:08.140202953 +0000 UTC m=+19.613581716" observedRunningTime="2026-04-16 22:05:10.340262683 +0000 UTC m=+21.813641452" watchObservedRunningTime="2026-04-16 22:05:12.361090643 +0000 UTC m=+23.834469413"
Apr 16 22:05:12.373081 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.373042 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4llp9" podStartSLOduration=5.536505438 podStartE2EDuration="23.373029514s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:04:50.330595921 +0000 UTC m=+1.803974670" lastFinishedPulling="2026-04-16 22:05:08.167119998 +0000 UTC m=+19.640498746" observedRunningTime="2026-04-16 22:05:12.361052101 +0000 UTC m=+23.834430871" watchObservedRunningTime="2026-04-16 22:05:12.373029514 +0000 UTC m=+23.846408284"
Apr 16 22:05:12.373638 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.373584 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-bdwfp" podStartSLOduration=5.678832981 podStartE2EDuration="23.373575303s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:04:50.445283582 +0000 UTC m=+1.918662335" lastFinishedPulling="2026-04-16 22:05:08.140025892 +0000 UTC m=+19.613404657" observedRunningTime="2026-04-16 22:05:12.372919852 +0000 UTC m=+23.846298621" watchObservedRunningTime="2026-04-16 22:05:12.373575303 +0000 UTC m=+23.846954073"
Apr 16 22:05:12.397846 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.397785 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tfhck" podStartSLOduration=5.641424367 podStartE2EDuration="23.397772467s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:04:50.383897657 +0000 UTC m=+1.857276406" lastFinishedPulling="2026-04-16 22:05:08.140245751 +0000 UTC m=+19.613624506" observedRunningTime="2026-04-16 22:05:12.397451614 +0000 UTC m=+23.870830385" watchObservedRunningTime="2026-04-16 22:05:12.397772467 +0000 UTC m=+23.871151261"
Apr 16 22:05:12.977544 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:12.977518 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 22:05:13.052011 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:13.051924 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:05:12.977540285Z","UUID":"71b14c22-b4bf-412b-8d83-6afd719f05fa","Handler":null,"Name":"","Endpoint":""}
Apr 16 22:05:13.054849 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:13.054829 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 22:05:13.054849 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:13.054855 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 22:05:13.133048 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:13.132950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5"
Apr 16 22:05:13.133197 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:13.133097 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef"
Apr 16 22:05:13.347645 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:13.347613 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" event={"ID":"987d2c5b-b0f1-4de0-a04a-f379a59db707","Type":"ContainerStarted","Data":"53736f4caef57fb757d4e0ca664db49cadfc174941e9f5d7c7e3e9d00fc62779"}
Apr 16 22:05:13.348973 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:13.348950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal" event={"ID":"ca5d577ea59a9b5f03fad9ebac668547","Type":"ContainerStarted","Data":"530d6476f821f1e059830bd9de85463722a45172a13d43959a40fd9b460eca57"}
Apr 16 22:05:13.362319 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:13.362268 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-68.ec2.internal" podStartSLOduration=24.362247743 podStartE2EDuration="24.362247743s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:05:13.361445627 +0000 UTC m=+24.834824400" watchObservedRunningTime="2026-04-16 22:05:13.362247743 +0000 UTC m=+24.835626517"
Apr 16 22:05:14.132733 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.132534 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:14.132886 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.132539 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:05:14.132886 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:14.132809 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964"
Apr 16 22:05:14.132886 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:14.132867 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757"
Apr 16 22:05:14.354672 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.354585 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log"
Apr 16 22:05:14.355242 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.354977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" event={"ID":"d7498930-9a40-4a06-a45f-79c56cdfd2e3","Type":"ContainerStarted","Data":"686654bda839057c0833772f2a26d2c9c062f40b65bf98331256648f9d018110"}
Apr 16 22:05:14.355298 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.355248 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-25llf"
Apr 16 22:05:14.355298 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.355270 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-25llf"
Apr 16 22:05:14.355445 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.355433 2576 scope.go:117] "RemoveContainer" containerID="255ad5cec845843c65e1120cd5aac7cca9f9e81232b6ea779790ca412e0ead50"
Apr 16 22:05:14.357423 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.357309 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bdwfp"
Apr 16 22:05:14.357423 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.357348 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" event={"ID":"987d2c5b-b0f1-4de0-a04a-f379a59db707","Type":"ContainerStarted","Data":"a2dd25d69af1abf3cf6270b3ad2f9754cc1d51636deffa5338811b3823ce7c62"}
Apr 16 22:05:14.358164 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.358142 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bdwfp"
Apr 16 22:05:14.374216 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.374191 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-25llf"
Apr 16 22:05:14.393373 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:14.393326 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xfb8b" podStartSLOduration=1.8129061530000001 podStartE2EDuration="25.393312928s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:04:50.309913733 +0000 UTC m=+1.783292481" lastFinishedPulling="2026-04-16 22:05:13.890320495 +0000 UTC m=+25.363699256" observedRunningTime="2026-04-16 22:05:14.393075248 +0000 UTC m=+25.866454020" watchObservedRunningTime="2026-04-16 22:05:14.393312928 +0000 UTC m=+25.866691699"
Apr 16 22:05:15.133137 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.133110 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5"
Apr 16 22:05:15.133284 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:15.133219 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef"
Apr 16 22:05:15.364917 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.364883 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log"
Apr 16 22:05:15.365722 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.365423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" event={"ID":"d7498930-9a40-4a06-a45f-79c56cdfd2e3","Type":"ContainerStarted","Data":"bcbe213556ad93ef10a0a7d918f50cb0df57be59392cfc6e7f465855d18ea6ec"}
Apr 16 22:05:15.367931 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.367907 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bdwfp"
Apr 16 22:05:15.371549 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.369962 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bdwfp"
Apr 16 22:05:15.371549 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.370448 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c2gt5"]
Apr 16 22:05:15.371714 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.371682 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-25llf"
Apr 16 22:05:15.371927 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.371854 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5"
Apr 16 22:05:15.372091 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:15.372058 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef"
Apr 16 22:05:15.372857 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.372833 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sc5nk"]
Apr 16 22:05:15.372986 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.372970 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:15.373148 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:15.373124 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964"
Apr 16 22:05:15.373636 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.373616 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hzjxc"]
Apr 16 22:05:15.373892 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.373732 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:05:15.373892 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:15.373825 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757"
Apr 16 22:05:15.390433 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.390354 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-25llf"
Apr 16 22:05:15.414034 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:15.413979 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-25llf" podStartSLOduration=8.583234902000001 podStartE2EDuration="26.413960772s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:04:50.440713086 +0000 UTC m=+1.914091835" lastFinishedPulling="2026-04-16 22:05:08.271438942 +0000 UTC m=+19.744817705" observedRunningTime="2026-04-16 22:05:15.413458559 +0000 UTC m=+26.886837364" watchObservedRunningTime="2026-04-16 22:05:15.413960772 +0000 UTC m=+26.887339543"
Apr 16 22:05:17.133319 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:17.133284 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5"
Apr 16 22:05:17.133938 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:17.133289 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:05:17.133938 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:17.133430 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef"
Apr 16 22:05:17.133938 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:17.133289 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:17.133938 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:17.133479 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757"
Apr 16 22:05:17.133938 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:17.133563 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964" Apr 16 22:05:18.372951 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:18.372749 2576 generic.go:358] "Generic (PLEG): container finished" podID="995fbb71-6c0e-4689-8c49-6fd0c1a79f15" containerID="e2560c0cf0c70c55e87cf6289f862440d359597a626bd04c7f4b4d059506c3b2" exitCode=0 Apr 16 22:05:18.372951 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:18.372816 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gb76" event={"ID":"995fbb71-6c0e-4689-8c49-6fd0c1a79f15","Type":"ContainerDied","Data":"e2560c0cf0c70c55e87cf6289f862440d359597a626bd04c7f4b4d059506c3b2"} Apr 16 22:05:19.134026 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:19.133999 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:05:19.134196 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:19.134035 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:05:19.134196 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:19.134115 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:05:19.134196 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:19.134139 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757" Apr 16 22:05:19.134306 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:19.134193 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef" Apr 16 22:05:19.134306 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:19.134254 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964" Apr 16 22:05:20.378786 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:20.378752 2576 generic.go:358] "Generic (PLEG): container finished" podID="995fbb71-6c0e-4689-8c49-6fd0c1a79f15" containerID="97cafbd237b8d01cb4dd7f27d1f6f6e52b327e2979cae131ffa81c6dd3d12bef" exitCode=0 Apr 16 22:05:20.379456 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:20.378802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gb76" event={"ID":"995fbb71-6c0e-4689-8c49-6fd0c1a79f15","Type":"ContainerDied","Data":"97cafbd237b8d01cb4dd7f27d1f6f6e52b327e2979cae131ffa81c6dd3d12bef"} Apr 16 22:05:21.132793 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.132602 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:05:21.132953 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.132608 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:05:21.132953 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:21.132871 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sc5nk" podUID="ffbd2631-70f8-45c8-83f0-5e65052e0964" Apr 16 22:05:21.132953 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.132608 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:05:21.133072 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:21.132954 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-c2gt5" podUID="d899808a-e158-4915-b6c2-f135d5b829ef" Apr 16 22:05:21.133072 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:21.133042 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757" Apr 16 22:05:21.846610 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.846583 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-68.ec2.internal" event="NodeReady" Apr 16 22:05:21.846965 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.846734 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 22:05:21.855894 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.855863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:05:21.856015 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:21.856001 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:05:21.856067 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:21.856057 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs podName:6690fd79-9fd1-41a1-acf7-d29fd96d4757 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:53.856039246 +0000 UTC m=+65.329418007 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs") pod "network-metrics-daemon-hzjxc" (UID: "6690fd79-9fd1-41a1-acf7-d29fd96d4757") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:05:21.876347 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.876322 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"] Apr 16 22:05:21.879130 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.879027 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm" Apr 16 22:05:21.880151 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.879953 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"] Apr 16 22:05:21.881461 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.881440 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 22:05:21.881568 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.881494 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-mvhtd\"" Apr 16 22:05:21.881858 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.881835 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 22:05:21.882808 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.882790 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4"] Apr 16 22:05:21.882946 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.882930 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" Apr 16 22:05:21.885050 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.885029 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 22:05:21.885050 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.885045 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 22:05:21.885243 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.885073 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 22:05:21.885243 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.885084 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 22:05:21.885243 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.885197 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 22:05:21.885373 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.885257 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 22:05:21.885526 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.885509 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 22:05:21.885563 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.885528 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-f867cf5f7-6snzj"] Apr 
16 22:05:21.885732 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.885715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4" Apr 16 22:05:21.887720 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.887683 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 22:05:21.887790 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.887765 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-n7rbz\"" Apr 16 22:05:21.889218 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.889200 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"] Apr 16 22:05:21.889628 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.889334 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" Apr 16 22:05:21.894061 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.893313 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 22:05:21.894061 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.893566 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 22:05:21.894061 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.893574 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6952s\"" Apr 16 22:05:21.894061 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.894048 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"] Apr 16 22:05:21.894303 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.894202 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" Apr 16 22:05:21.895068 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.894721 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 22:05:21.895068 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.894866 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4"] Apr 16 22:05:21.896730 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.896550 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"] Apr 16 22:05:21.896730 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.896583 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 22:05:21.897448 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.897431 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"] Apr 16 22:05:21.898157 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.898137 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-r2hfz"] Apr 16 22:05:21.899758 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.899739 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 22:05:21.901072 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.901059 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-r2hfz" Apr 16 22:05:21.902954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.902935 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 22:05:21.903022 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.902957 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 22:05:21.903022 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.902958 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vts6k\"" Apr 16 22:05:21.910710 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.910671 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-r2hfz"] Apr 16 22:05:21.911456 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.911436 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f867cf5f7-6snzj"] Apr 16 22:05:21.957203 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.957183 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fvk6\" (UniqueName: \"kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6\") pod \"network-check-target-sc5nk\" (UID: \"ffbd2631-70f8-45c8-83f0-5e65052e0964\") " pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:05:21.957309 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:21.957296 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:05:21.957353 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:21.957312 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 
22:05:21.957353 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:21.957320 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fvk6 for pod openshift-network-diagnostics/network-check-target-sc5nk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:05:21.957425 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:21.957364 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6 podName:ffbd2631-70f8-45c8-83f0-5e65052e0964 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:53.957350122 +0000 UTC m=+65.430728871 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fvk6" (UniqueName: "kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6") pod "network-check-target-sc5nk" (UID: "ffbd2631-70f8-45c8-83f0-5e65052e0964") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:05:21.992947 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.992917 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2pb4t"] Apr 16 22:05:21.995958 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.995943 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2pb4t" Apr 16 22:05:21.998180 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.998159 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 22:05:21.998304 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.998210 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 22:05:21.998304 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.998168 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 22:05:21.998304 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:21.998243 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q2drx\"" Apr 16 22:05:22.003137 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.003119 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2pb4t"] Apr 16 22:05:22.058496 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058460 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm" Apr 16 22:05:22.058496 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-certificates\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " 
pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" Apr 16 22:05:22.058760 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/43fc934b-f73b-46de-8615-4974dcadbf4a-klusterlet-config\") pod \"klusterlet-addon-workmgr-6b58d9997f-6jvpv\" (UID: \"43fc934b-f73b-46de-8615-4974dcadbf4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" Apr 16 22:05:22.058760 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" Apr 16 22:05:22.058760 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz" Apr 16 22:05:22.058760 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-bound-sa-token\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" Apr 16 22:05:22.058760 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058740 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-qc5hx\" (UniqueName: \"kubernetes.io/projected/188339a6-f380-407b-afe4-3c97b17b4206-kube-api-access-qc5hx\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" Apr 16 22:05:22.058954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-ca\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" Apr 16 22:05:22.058954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-hub\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" Apr 16 22:05:22.058954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" Apr 16 22:05:22.058954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvshn\" (UniqueName: 
\"kubernetes.io/projected/91d334e4-11b9-4900-b5ea-d30c96867f31-kube-api-access-hvshn\") pod \"managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4\" (UID: \"91d334e4-11b9-4900-b5ea-d30c96867f31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4" Apr 16 22:05:22.058954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-ca-trust-extracted\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" Apr 16 22:05:22.058954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058875 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/91d334e4-11b9-4900-b5ea-d30c96867f31-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4\" (UID: \"91d334e4-11b9-4900-b5ea-d30c96867f31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4" Apr 16 22:05:22.058954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vpkq\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-kube-api-access-2vpkq\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" Apr 16 22:05:22.058954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/270e029b-17e5-4312-8fcc-59dfc7eecac7-tmp-dir\") pod 
\"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.059334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.058973 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxzqz\" (UniqueName: \"kubernetes.io/projected/270e029b-17e5-4312-8fcc-59dfc7eecac7-kube-api-access-qxzqz\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.059334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.059057 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/188339a6-f380-407b-afe4-3c97b17b4206-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.059334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.059105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:05:22.059334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.059133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-image-registry-private-configuration\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.059334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.059167 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.059334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.059195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43fc934b-f73b-46de-8615-4974dcadbf4a-tmp\") pod \"klusterlet-addon-workmgr-6b58d9997f-6jvpv\" (UID: \"43fc934b-f73b-46de-8615-4974dcadbf4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"
Apr 16 22:05:22.059334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.059218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm9vj\" (UniqueName: \"kubernetes.io/projected/43fc934b-f73b-46de-8615-4974dcadbf4a-kube-api-access-vm9vj\") pod \"klusterlet-addon-workmgr-6b58d9997f-6jvpv\" (UID: \"43fc934b-f73b-46de-8615-4974dcadbf4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"
Apr 16 22:05:22.059334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.059249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-trusted-ca\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.059334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.059276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-installation-pull-secrets\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.059334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.059300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/270e029b-17e5-4312-8fcc-59dfc7eecac7-config-volume\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.160076 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.159963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-certificates\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.160076 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.160076 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc5hx\" (UniqueName: \"kubernetes.io/projected/188339a6-f380-407b-afe4-3c97b17b4206-kube-api-access-qc5hx\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.160076 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-hub\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.160395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvshn\" (UniqueName: \"kubernetes.io/projected/91d334e4-11b9-4900-b5ea-d30c96867f31-kube-api-access-hvshn\") pod \"managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4\" (UID: \"91d334e4-11b9-4900-b5ea-d30c96867f31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4"
Apr 16 22:05:22.160395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-ca-trust-extracted\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.160395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/91d334e4-11b9-4900-b5ea-d30c96867f31-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4\" (UID: \"91d334e4-11b9-4900-b5ea-d30c96867f31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4"
Apr 16 22:05:22.160395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/188339a6-f380-407b-afe4-3c97b17b4206-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.160395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:05:22.160395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43fc934b-f73b-46de-8615-4974dcadbf4a-tmp\") pod \"klusterlet-addon-workmgr-6b58d9997f-6jvpv\" (UID: \"43fc934b-f73b-46de-8615-4974dcadbf4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"
Apr 16 22:05:22.160395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vm9vj\" (UniqueName: \"kubernetes.io/projected/43fc934b-f73b-46de-8615-4974dcadbf4a-kube-api-access-vm9vj\") pod \"klusterlet-addon-workmgr-6b58d9997f-6jvpv\" (UID: \"43fc934b-f73b-46de-8615-4974dcadbf4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"
Apr 16 22:05:22.160395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:05:22.160395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.160395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-ca\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vpkq\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-kube-api-access-2vpkq\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-ca-trust-extracted\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-installation-pull-secrets\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxzqz\" (UniqueName: \"kubernetes.io/projected/270e029b-17e5-4312-8fcc-59dfc7eecac7-kube-api-access-qxzqz\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160577 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/270e029b-17e5-4312-8fcc-59dfc7eecac7-tmp-dir\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-image-registry-private-configuration\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.160619 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-trusted-ca\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-certificates\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.160682 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert podName:0f18a248-9dfa-4c91-b4cd-46c1f19634f1 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:22.660662657 +0000 UTC m=+34.134041424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s42mm" (UID: "0f18a248-9dfa-4c91-b4cd-46c1f19634f1") : secret "networking-console-plugin-cert" not found
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/270e029b-17e5-4312-8fcc-59dfc7eecac7-config-volume\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vc9m\" (UniqueName: \"kubernetes.io/projected/97234e5c-490f-432d-a702-1a85fbcc4044-kube-api-access-9vc9m\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:05:22.160899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/43fc934b-f73b-46de-8615-4974dcadbf4a-klusterlet-config\") pod \"klusterlet-addon-workmgr-6b58d9997f-6jvpv\" (UID: \"43fc934b-f73b-46de-8615-4974dcadbf4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"
Apr 16 22:05:22.161612 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-bound-sa-token\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.161612 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.160999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/188339a6-f380-407b-afe4-3c97b17b4206-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.161612 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.161129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43fc934b-f73b-46de-8615-4974dcadbf4a-tmp\") pod \"klusterlet-addon-workmgr-6b58d9997f-6jvpv\" (UID: \"43fc934b-f73b-46de-8615-4974dcadbf4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"
Apr 16 22:05:22.161612 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.161202 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:05:22.161612 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.161257 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls podName:270e029b-17e5-4312-8fcc-59dfc7eecac7 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:22.66123969 +0000 UTC m=+34.134618441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls") pod "dns-default-r2hfz" (UID: "270e029b-17e5-4312-8fcc-59dfc7eecac7") : secret "dns-default-metrics-tls" not found
Apr 16 22:05:22.161612 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.161250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/270e029b-17e5-4312-8fcc-59dfc7eecac7-tmp-dir\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.161612 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.161289 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:05:22.161612 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.161304 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f867cf5f7-6snzj: secret "image-registry-tls" not found
Apr 16 22:05:22.161612 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.161349 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls podName:394a3f59-3a76-4ffe-a23e-d3e08badf2dc nodeName:}" failed. No retries permitted until 2026-04-16 22:05:22.66133201 +0000 UTC m=+34.134710780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls") pod "image-registry-f867cf5f7-6snzj" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc") : secret "image-registry-tls" not found
Apr 16 22:05:22.162044 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.161804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/270e029b-17e5-4312-8fcc-59dfc7eecac7-config-volume\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.162801 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.162488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-trusted-ca\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.165669 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.165643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.165669 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.165644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-image-registry-private-configuration\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.165855 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.165720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-installation-pull-secrets\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.165855 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.165752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-hub\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.165855 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.165812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/91d334e4-11b9-4900-b5ea-d30c96867f31-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4\" (UID: \"91d334e4-11b9-4900-b5ea-d30c96867f31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4"
Apr 16 22:05:22.165855 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.165813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/43fc934b-f73b-46de-8615-4974dcadbf4a-klusterlet-config\") pod \"klusterlet-addon-workmgr-6b58d9997f-6jvpv\" (UID: \"43fc934b-f73b-46de-8615-4974dcadbf4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"
Apr 16 22:05:22.166095 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.166077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-ca\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.166269 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.166239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:05:22.166341 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.166325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/188339a6-f380-407b-afe4-3c97b17b4206-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.169740 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.169635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvshn\" (UniqueName: \"kubernetes.io/projected/91d334e4-11b9-4900-b5ea-d30c96867f31-kube-api-access-hvshn\") pod \"managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4\" (UID: \"91d334e4-11b9-4900-b5ea-d30c96867f31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4"
Apr 16 22:05:22.169740 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.169681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxzqz\" (UniqueName: \"kubernetes.io/projected/270e029b-17e5-4312-8fcc-59dfc7eecac7-kube-api-access-qxzqz\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.170076 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.170032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-bound-sa-token\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.170351 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.170254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc5hx\" (UniqueName: \"kubernetes.io/projected/188339a6-f380-407b-afe4-3c97b17b4206-kube-api-access-qc5hx\") pod \"cluster-proxy-proxy-agent-7c6df5dc85-sh7qd\" (UID: \"188339a6-f380-407b-afe4-3c97b17b4206\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.172518 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.171940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vpkq\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-kube-api-access-2vpkq\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.173321 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.173303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm9vj\" (UniqueName: \"kubernetes.io/projected/43fc934b-f73b-46de-8615-4974dcadbf4a-kube-api-access-vm9vj\") pod \"klusterlet-addon-workmgr-6b58d9997f-6jvpv\" (UID: \"43fc934b-f73b-46de-8615-4974dcadbf4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"
Apr 16 22:05:22.211540 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.211494 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:05:22.218298 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.218278 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4"
Apr 16 22:05:22.239057 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.239014 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"
Apr 16 22:05:22.261361 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.261329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:05:22.261510 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.261407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vc9m\" (UniqueName: \"kubernetes.io/projected/97234e5c-490f-432d-a702-1a85fbcc4044-kube-api-access-9vc9m\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:05:22.261572 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.261512 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:05:22.261628 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.261618 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert podName:97234e5c-490f-432d-a702-1a85fbcc4044 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:22.761594766 +0000 UTC m=+34.234973513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert") pod "ingress-canary-2pb4t" (UID: "97234e5c-490f-432d-a702-1a85fbcc4044") : secret "canary-serving-cert" not found
Apr 16 22:05:22.276264 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.276192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vc9m\" (UniqueName: \"kubernetes.io/projected/97234e5c-490f-432d-a702-1a85fbcc4044-kube-api-access-9vc9m\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:05:22.367598 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.367568 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4"]
Apr 16 22:05:22.370393 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.370345 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"]
Apr 16 22:05:22.372735 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:05:22.372703 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91d334e4_11b9_4900_b5ea_d30c96867f31.slice/crio-aaa5ce17853fe7969f479205415a6b01842d28cf8cf5e64c2e2c4ee491cee807 WatchSource:0}: Error finding container aaa5ce17853fe7969f479205415a6b01842d28cf8cf5e64c2e2c4ee491cee807: Status 404 returned error can't find the container with id aaa5ce17853fe7969f479205415a6b01842d28cf8cf5e64c2e2c4ee491cee807
Apr 16 22:05:22.374134 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:05:22.374107 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod188339a6_f380_407b_afe4_3c97b17b4206.slice/crio-5add93556b6fa8b409c5661bf4fd096cf1337444adb47aeade01b0b8b52a2e62 WatchSource:0}: Error finding container 5add93556b6fa8b409c5661bf4fd096cf1337444adb47aeade01b0b8b52a2e62: Status 404 returned error can't find the container with id 5add93556b6fa8b409c5661bf4fd096cf1337444adb47aeade01b0b8b52a2e62
Apr 16 22:05:22.383908 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.383874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4" event={"ID":"91d334e4-11b9-4900-b5ea-d30c96867f31","Type":"ContainerStarted","Data":"aaa5ce17853fe7969f479205415a6b01842d28cf8cf5e64c2e2c4ee491cee807"}
Apr 16 22:05:22.386193 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.386171 2576 generic.go:358] "Generic (PLEG): container finished" podID="995fbb71-6c0e-4689-8c49-6fd0c1a79f15" containerID="b7813fbdbeb01017970f4f9bf2aff14046ea7364fa5465de46c657490f334da2" exitCode=0
Apr 16 22:05:22.386265 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.386238 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gb76" event={"ID":"995fbb71-6c0e-4689-8c49-6fd0c1a79f15","Type":"ContainerDied","Data":"b7813fbdbeb01017970f4f9bf2aff14046ea7364fa5465de46c657490f334da2"}
Apr 16 22:05:22.387318 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.387296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" event={"ID":"188339a6-f380-407b-afe4-3c97b17b4206","Type":"ContainerStarted","Data":"5add93556b6fa8b409c5661bf4fd096cf1337444adb47aeade01b0b8b52a2e62"}
Apr 16 22:05:22.395808 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.395786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv"]
Apr 16 22:05:22.399473 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:05:22.399449 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43fc934b_f73b_46de_8615_4974dcadbf4a.slice/crio-9de81601a643b095b391793dd21204861d307ff82b679c1bcd3acd9ee2c6be08 WatchSource:0}: Error finding container 9de81601a643b095b391793dd21204861d307ff82b679c1bcd3acd9ee2c6be08: Status 404 returned error can't find the container with id 9de81601a643b095b391793dd21204861d307ff82b679c1bcd3acd9ee2c6be08
Apr 16 22:05:22.664271 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.664226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:05:22.664271 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.664272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:22.664521 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.664291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:22.664521 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.664381 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:05:22.664521 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.664401 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f867cf5f7-6snzj: secret "image-registry-tls" not found
Apr 16 22:05:22.664521 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.664412 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:05:22.664521 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.664456 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls podName:394a3f59-3a76-4ffe-a23e-d3e08badf2dc nodeName:}" failed. No retries permitted until 2026-04-16 22:05:23.664440584 +0000 UTC m=+35.137819332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls") pod "image-registry-f867cf5f7-6snzj" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc") : secret "image-registry-tls" not found
Apr 16 22:05:22.664521 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.664475 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert podName:0f18a248-9dfa-4c91-b4cd-46c1f19634f1 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:23.664462104 +0000 UTC m=+35.137840853 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s42mm" (UID: "0f18a248-9dfa-4c91-b4cd-46c1f19634f1") : secret "networking-console-plugin-cert" not found
Apr 16 22:05:22.664521 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.664385 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:05:22.664521 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.664510 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls podName:270e029b-17e5-4312-8fcc-59dfc7eecac7 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:23.66449991 +0000 UTC m=+35.137878660 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls") pod "dns-default-r2hfz" (UID: "270e029b-17e5-4312-8fcc-59dfc7eecac7") : secret "dns-default-metrics-tls" not found
Apr 16 22:05:22.765125 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:22.765092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:05:22.765283 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.765233 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:05:22.765323 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:22.765302 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert
podName:97234e5c-490f-432d-a702-1a85fbcc4044 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:23.76528586 +0000 UTC m=+35.238664608 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert") pod "ingress-canary-2pb4t" (UID: "97234e5c-490f-432d-a702-1a85fbcc4044") : secret "canary-serving-cert" not found Apr 16 22:05:23.134075 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.133846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:05:23.135048 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.134842 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:05:23.135812 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.135792 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk" Apr 16 22:05:23.136602 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.136562 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrzhk\"" Apr 16 22:05:23.136815 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.136797 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:05:23.137084 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.137062 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 22:05:23.142098 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.142078 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm5fk\"" Apr 16 22:05:23.142293 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.142276 2576 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:05:23.142469 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.142455 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:05:23.394335 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.394243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" event={"ID":"43fc934b-f73b-46de-8615-4974dcadbf4a","Type":"ContainerStarted","Data":"9de81601a643b095b391793dd21204861d307ff82b679c1bcd3acd9ee2c6be08"} Apr 16 22:05:23.671453 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.671281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz" Apr 16 22:05:23.671621 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.671469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm" Apr 16 22:05:23.671621 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.671523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " 
pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" Apr 16 22:05:23.671770 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:23.671655 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:05:23.671770 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:23.671670 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f867cf5f7-6snzj: secret "image-registry-tls" not found Apr 16 22:05:23.671770 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:23.671746 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls podName:394a3f59-3a76-4ffe-a23e-d3e08badf2dc nodeName:}" failed. No retries permitted until 2026-04-16 22:05:25.671725139 +0000 UTC m=+37.145103903 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls") pod "image-registry-f867cf5f7-6snzj" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc") : secret "image-registry-tls" not found Apr 16 22:05:23.671935 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:23.671843 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:23.671935 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:23.671883 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls podName:270e029b-17e5-4312-8fcc-59dfc7eecac7 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:25.671871847 +0000 UTC m=+37.145250598 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls") pod "dns-default-r2hfz" (UID: "270e029b-17e5-4312-8fcc-59dfc7eecac7") : secret "dns-default-metrics-tls" not found Apr 16 22:05:23.672026 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:23.671944 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:05:23.672026 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:23.671976 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert podName:0f18a248-9dfa-4c91-b4cd-46c1f19634f1 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:25.67196546 +0000 UTC m=+37.145344215 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s42mm" (UID: "0f18a248-9dfa-4c91-b4cd-46c1f19634f1") : secret "networking-console-plugin-cert" not found Apr 16 22:05:23.772589 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:23.772555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t" Apr 16 22:05:23.772810 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:23.772791 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:23.772879 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:23.772866 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert podName:97234e5c-490f-432d-a702-1a85fbcc4044 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:25.772845804 +0000 UTC m=+37.246224570 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert") pod "ingress-canary-2pb4t" (UID: "97234e5c-490f-432d-a702-1a85fbcc4044") : secret "canary-serving-cert" not found Apr 16 22:05:25.692017 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:25.691972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" Apr 16 22:05:25.692475 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:25.692026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz" Apr 16 22:05:25.692475 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:25.692126 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:05:25.692475 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:25.692138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm" Apr 16 
22:05:25.692475 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:25.692150 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f867cf5f7-6snzj: secret "image-registry-tls" not found Apr 16 22:05:25.692475 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:25.692206 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls podName:394a3f59-3a76-4ffe-a23e-d3e08badf2dc nodeName:}" failed. No retries permitted until 2026-04-16 22:05:29.692187044 +0000 UTC m=+41.165565792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls") pod "image-registry-f867cf5f7-6snzj" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc") : secret "image-registry-tls" not found Apr 16 22:05:25.692475 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:25.692214 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:25.692475 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:25.692247 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:05:25.692475 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:25.692278 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls podName:270e029b-17e5-4312-8fcc-59dfc7eecac7 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:29.692259853 +0000 UTC m=+41.165638607 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls") pod "dns-default-r2hfz" (UID: "270e029b-17e5-4312-8fcc-59dfc7eecac7") : secret "dns-default-metrics-tls" not found Apr 16 22:05:25.692475 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:25.692309 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert podName:0f18a248-9dfa-4c91-b4cd-46c1f19634f1 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:29.692287894 +0000 UTC m=+41.165666649 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s42mm" (UID: "0f18a248-9dfa-4c91-b4cd-46c1f19634f1") : secret "networking-console-plugin-cert" not found Apr 16 22:05:25.793726 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:25.793474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t" Apr 16 22:05:25.793726 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:25.793678 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:25.794457 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:25.794081 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert podName:97234e5c-490f-432d-a702-1a85fbcc4044 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:29.794058872 +0000 UTC m=+41.267437634 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert") pod "ingress-canary-2pb4t" (UID: "97234e5c-490f-432d-a702-1a85fbcc4044") : secret "canary-serving-cert" not found Apr 16 22:05:27.006516 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:27.006470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:05:27.010254 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:27.010227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d899808a-e158-4915-b6c2-f135d5b829ef-original-pull-secret\") pod \"global-pull-secret-syncer-c2gt5\" (UID: \"d899808a-e158-4915-b6c2-f135d5b829ef\") " pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:05:27.069741 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:27.069703 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2gt5" Apr 16 22:05:29.728891 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:29.728846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm" Apr 16 22:05:29.728891 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:29.728900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" Apr 16 22:05:29.729405 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:29.728970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz" Apr 16 22:05:29.729405 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:29.728985 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:05:29.729405 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:29.728995 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f867cf5f7-6snzj: secret "image-registry-tls" not found Apr 16 22:05:29.729405 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:29.729057 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls podName:394a3f59-3a76-4ffe-a23e-d3e08badf2dc nodeName:}" failed. No retries permitted until 2026-04-16 22:05:37.729037662 +0000 UTC m=+49.202416415 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls") pod "image-registry-f867cf5f7-6snzj" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc") : secret "image-registry-tls" not found Apr 16 22:05:29.729405 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:29.729101 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:05:29.729405 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:29.729131 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:29.729405 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:29.729161 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert podName:0f18a248-9dfa-4c91-b4cd-46c1f19634f1 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:37.729144585 +0000 UTC m=+49.202523347 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s42mm" (UID: "0f18a248-9dfa-4c91-b4cd-46c1f19634f1") : secret "networking-console-plugin-cert" not found Apr 16 22:05:29.729405 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:29.729186 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls podName:270e029b-17e5-4312-8fcc-59dfc7eecac7 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:05:37.729174981 +0000 UTC m=+49.202553729 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls") pod "dns-default-r2hfz" (UID: "270e029b-17e5-4312-8fcc-59dfc7eecac7") : secret "dns-default-metrics-tls" not found Apr 16 22:05:29.830268 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:29.830234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t" Apr 16 22:05:29.830437 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:29.830341 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:29.830437 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:29.830416 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert podName:97234e5c-490f-432d-a702-1a85fbcc4044 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:37.830399316 +0000 UTC m=+49.303778071 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert") pod "ingress-canary-2pb4t" (UID: "97234e5c-490f-432d-a702-1a85fbcc4044") : secret "canary-serving-cert" not found Apr 16 22:05:31.367514 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:31.367476 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c2gt5"] Apr 16 22:05:31.414015 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:31.413971 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" event={"ID":"188339a6-f380-407b-afe4-3c97b17b4206","Type":"ContainerStarted","Data":"55423c74d0f4a379b57121ccf29631b162b676ef075de37196de67425729b34c"} Apr 16 22:05:31.415724 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:31.415629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4" event={"ID":"91d334e4-11b9-4900-b5ea-d30c96867f31","Type":"ContainerStarted","Data":"791e7b3acb5b158006fdeaba0d03464e08eef67f32cad45f8c92af7686e564a9"} Apr 16 22:05:31.417254 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:31.417225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" event={"ID":"43fc934b-f73b-46de-8615-4974dcadbf4a","Type":"ContainerStarted","Data":"e1c4008079c7f32e6ab978b572ee4b7368736cf79617366fc66cddeeed7c5c79"} Apr 16 22:05:31.417962 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:31.417947 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" Apr 16 22:05:31.419130 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:31.419108 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" Apr 16 22:05:31.428823 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:31.428777 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4" podStartSLOduration=6.554701239 podStartE2EDuration="15.428761446s" podCreationTimestamp="2026-04-16 22:05:16 +0000 UTC" firstStartedPulling="2026-04-16 22:05:22.374833784 +0000 UTC m=+33.848212545" lastFinishedPulling="2026-04-16 22:05:31.248893992 +0000 UTC m=+42.722272752" observedRunningTime="2026-04-16 22:05:31.428759525 +0000 UTC m=+42.902138295" watchObservedRunningTime="2026-04-16 22:05:31.428761446 +0000 UTC m=+42.902140213" Apr 16 22:05:31.444779 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:31.444733 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" podStartSLOduration=6.581089641 podStartE2EDuration="15.444717025s" podCreationTimestamp="2026-04-16 22:05:16 +0000 UTC" firstStartedPulling="2026-04-16 22:05:22.401754656 +0000 UTC m=+33.875133408" lastFinishedPulling="2026-04-16 22:05:31.265382044 +0000 UTC m=+42.738760792" observedRunningTime="2026-04-16 22:05:31.444508075 +0000 UTC m=+42.917886845" watchObservedRunningTime="2026-04-16 22:05:31.444717025 +0000 UTC m=+42.918095797" Apr 16 22:05:31.450171 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:05:31.450148 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd899808a_e158_4915_b6c2_f135d5b829ef.slice/crio-304df35fcf86a7deac8bafe2e45f372f8cf5d3c2ab6920a2a7c8c00cfdd81d59 WatchSource:0}: Error finding container 304df35fcf86a7deac8bafe2e45f372f8cf5d3c2ab6920a2a7c8c00cfdd81d59: Status 404 returned error can't find the container with id 
304df35fcf86a7deac8bafe2e45f372f8cf5d3c2ab6920a2a7c8c00cfdd81d59 Apr 16 22:05:32.421102 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:32.421050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c2gt5" event={"ID":"d899808a-e158-4915-b6c2-f135d5b829ef","Type":"ContainerStarted","Data":"304df35fcf86a7deac8bafe2e45f372f8cf5d3c2ab6920a2a7c8c00cfdd81d59"} Apr 16 22:05:32.424772 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:32.424574 2576 generic.go:358] "Generic (PLEG): container finished" podID="995fbb71-6c0e-4689-8c49-6fd0c1a79f15" containerID="611b665a84c8509f555a414c56e5af448498fa86fa83bbcc4f8ff9e86bb7af7f" exitCode=0 Apr 16 22:05:32.424772 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:32.424717 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gb76" event={"ID":"995fbb71-6c0e-4689-8c49-6fd0c1a79f15","Type":"ContainerDied","Data":"611b665a84c8509f555a414c56e5af448498fa86fa83bbcc4f8ff9e86bb7af7f"} Apr 16 22:05:33.429379 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:33.429343 2576 generic.go:358] "Generic (PLEG): container finished" podID="995fbb71-6c0e-4689-8c49-6fd0c1a79f15" containerID="d78b66a56dda4c9aeaf02c5f45d0b9c159811c676cbd8e63aa061683b653f8f7" exitCode=0 Apr 16 22:05:33.429855 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:33.429412 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gb76" event={"ID":"995fbb71-6c0e-4689-8c49-6fd0c1a79f15","Type":"ContainerDied","Data":"d78b66a56dda4c9aeaf02c5f45d0b9c159811c676cbd8e63aa061683b653f8f7"} Apr 16 22:05:34.433752 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:34.433722 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gb76" event={"ID":"995fbb71-6c0e-4689-8c49-6fd0c1a79f15","Type":"ContainerStarted","Data":"7a6d680f095335e34e34183f56c0edb82b6c73ba8252546437f5f44e9caad4e6"} Apr 16 
22:05:34.435184 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:34.435163 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" event={"ID":"188339a6-f380-407b-afe4-3c97b17b4206","Type":"ContainerStarted","Data":"f053765cea4539fe56a5edd823eb3342057a2500030ffc81f9bac93fe6d71232"}
Apr 16 22:05:34.454416 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:34.454376 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2gb76" podStartSLOduration=4.373864663 podStartE2EDuration="45.45436473s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:04:50.404134021 +0000 UTC m=+1.877512769" lastFinishedPulling="2026-04-16 22:05:31.484634085 +0000 UTC m=+42.958012836" observedRunningTime="2026-04-16 22:05:34.453210885 +0000 UTC m=+45.926589664" watchObservedRunningTime="2026-04-16 22:05:34.45436473 +0000 UTC m=+45.927743500"
Apr 16 22:05:35.439577 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:35.439534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" event={"ID":"188339a6-f380-407b-afe4-3c97b17b4206","Type":"ContainerStarted","Data":"ed7f67d3815414a651f6992084187caa3002570a97f38a3e9d759dc5a7ef51fa"}
Apr 16 22:05:35.458330 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:35.458280 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" podStartSLOduration=7.609006678 podStartE2EDuration="19.45826392s" podCreationTimestamp="2026-04-16 22:05:16 +0000 UTC" firstStartedPulling="2026-04-16 22:05:22.376201667 +0000 UTC m=+33.849580419" lastFinishedPulling="2026-04-16 22:05:34.22545891 +0000 UTC m=+45.698837661" observedRunningTime="2026-04-16 22:05:35.457577609 +0000 UTC m=+46.930956381" watchObservedRunningTime="2026-04-16 22:05:35.45826392 +0000 UTC m=+46.931642690"
Apr 16 22:05:37.444211 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:37.444170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c2gt5" event={"ID":"d899808a-e158-4915-b6c2-f135d5b829ef","Type":"ContainerStarted","Data":"286178c9e51e3d13568096c01da9dbe37aeda508c1de2d4fa105aa3c46f9170d"}
Apr 16 22:05:37.463716 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:37.463657 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-c2gt5" podStartSLOduration=37.478651504 podStartE2EDuration="42.463643685s" podCreationTimestamp="2026-04-16 22:04:55 +0000 UTC" firstStartedPulling="2026-04-16 22:05:31.461725851 +0000 UTC m=+42.935104598" lastFinishedPulling="2026-04-16 22:05:36.446718031 +0000 UTC m=+47.920096779" observedRunningTime="2026-04-16 22:05:37.462920422 +0000 UTC m=+48.936299192" watchObservedRunningTime="2026-04-16 22:05:37.463643685 +0000 UTC m=+48.937022454"
Apr 16 22:05:37.801179 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:37.801084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:05:37.801179 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:37.801133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:37.801179 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:37.801151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:37.801423 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:37.801242 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:05:37.801423 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:37.801245 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:05:37.801423 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:37.801271 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f867cf5f7-6snzj: secret "image-registry-tls" not found
Apr 16 22:05:37.801423 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:37.801245 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:05:37.801423 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:37.801288 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls podName:270e029b-17e5-4312-8fcc-59dfc7eecac7 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:53.801275442 +0000 UTC m=+65.274654190 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls") pod "dns-default-r2hfz" (UID: "270e029b-17e5-4312-8fcc-59dfc7eecac7") : secret "dns-default-metrics-tls" not found
Apr 16 22:05:37.801423 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:37.801310 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls podName:394a3f59-3a76-4ffe-a23e-d3e08badf2dc nodeName:}" failed. No retries permitted until 2026-04-16 22:05:53.801298741 +0000 UTC m=+65.274677490 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls") pod "image-registry-f867cf5f7-6snzj" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc") : secret "image-registry-tls" not found
Apr 16 22:05:37.801423 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:37.801323 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert podName:0f18a248-9dfa-4c91-b4cd-46c1f19634f1 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:53.801317609 +0000 UTC m=+65.274696356 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s42mm" (UID: "0f18a248-9dfa-4c91-b4cd-46c1f19634f1") : secret "networking-console-plugin-cert" not found
Apr 16 22:05:37.901769 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:37.901723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:05:37.901942 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:37.901856 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:05:37.901942 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:37.901922 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert podName:97234e5c-490f-432d-a702-1a85fbcc4044 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:53.901906401 +0000 UTC m=+65.375285149 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert") pod "ingress-canary-2pb4t" (UID: "97234e5c-490f-432d-a702-1a85fbcc4044") : secret "canary-serving-cert" not found
Apr 16 22:05:47.380268 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:47.380236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-25llf"
Apr 16 22:05:53.821688 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:53.821633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:05:53.821688 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:53.821723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:05:53.822179 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:53.821754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:05:53.822179 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.821828 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:05:53.822179 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.821868 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:05:53.822179 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.821889 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f867cf5f7-6snzj: secret "image-registry-tls" not found
Apr 16 22:05:53.822179 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.821900 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:05:53.822179 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.821905 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert podName:0f18a248-9dfa-4c91-b4cd-46c1f19634f1 nodeName:}" failed. No retries permitted until 2026-04-16 22:06:25.821885508 +0000 UTC m=+97.295264271 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s42mm" (UID: "0f18a248-9dfa-4c91-b4cd-46c1f19634f1") : secret "networking-console-plugin-cert" not found
Apr 16 22:05:53.822179 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.821930 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls podName:394a3f59-3a76-4ffe-a23e-d3e08badf2dc nodeName:}" failed. No retries permitted until 2026-04-16 22:06:25.82191736 +0000 UTC m=+97.295296108 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls") pod "image-registry-f867cf5f7-6snzj" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc") : secret "image-registry-tls" not found
Apr 16 22:05:53.822179 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.821944 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls podName:270e029b-17e5-4312-8fcc-59dfc7eecac7 nodeName:}" failed. No retries permitted until 2026-04-16 22:06:25.821938335 +0000 UTC m=+97.295317082 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls") pod "dns-default-r2hfz" (UID: "270e029b-17e5-4312-8fcc-59dfc7eecac7") : secret "dns-default-metrics-tls" not found
Apr 16 22:05:53.922260 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:53.922229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:05:53.922393 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:53.922286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:05:53.922393 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.922362 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:05:53.922476 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.922408 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert podName:97234e5c-490f-432d-a702-1a85fbcc4044 nodeName:}" failed. No retries permitted until 2026-04-16 22:06:25.92239449 +0000 UTC m=+97.395773238 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert") pod "ingress-canary-2pb4t" (UID: "97234e5c-490f-432d-a702-1a85fbcc4044") : secret "canary-serving-cert" not found
Apr 16 22:05:53.925022 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:53.925006 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 22:05:53.932542 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.932526 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:05:53.932611 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:05:53.932572 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs podName:6690fd79-9fd1-41a1-acf7-d29fd96d4757 nodeName:}" failed. No retries permitted until 2026-04-16 22:06:57.932559253 +0000 UTC m=+129.405938004 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs") pod "network-metrics-daemon-hzjxc" (UID: "6690fd79-9fd1-41a1-acf7-d29fd96d4757") : secret "metrics-daemon-secret" not found
Apr 16 22:05:54.022984 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:54.022954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fvk6\" (UniqueName: \"kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6\") pod \"network-check-target-sc5nk\" (UID: \"ffbd2631-70f8-45c8-83f0-5e65052e0964\") " pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:54.025851 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:54.025834 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:05:54.036109 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:54.036091 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:05:54.047191 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:54.047162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fvk6\" (UniqueName: \"kubernetes.io/projected/ffbd2631-70f8-45c8-83f0-5e65052e0964-kube-api-access-7fvk6\") pod \"network-check-target-sc5nk\" (UID: \"ffbd2631-70f8-45c8-83f0-5e65052e0964\") " pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:54.081788 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:54.081716 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm5fk\""
Apr 16 22:05:54.090304 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:54.090284 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:54.201146 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:54.201115 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sc5nk"]
Apr 16 22:05:54.203817 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:05:54.203787 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffbd2631_70f8_45c8_83f0_5e65052e0964.slice/crio-37973e4884df82b9807c7b6bef74ea2260c4fa0ff01a8ba44401abd020721c38 WatchSource:0}: Error finding container 37973e4884df82b9807c7b6bef74ea2260c4fa0ff01a8ba44401abd020721c38: Status 404 returned error can't find the container with id 37973e4884df82b9807c7b6bef74ea2260c4fa0ff01a8ba44401abd020721c38
Apr 16 22:05:54.489832 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:54.489794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sc5nk" event={"ID":"ffbd2631-70f8-45c8-83f0-5e65052e0964","Type":"ContainerStarted","Data":"37973e4884df82b9807c7b6bef74ea2260c4fa0ff01a8ba44401abd020721c38"}
Apr 16 22:05:57.500550 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:57.500513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sc5nk" event={"ID":"ffbd2631-70f8-45c8-83f0-5e65052e0964","Type":"ContainerStarted","Data":"3a79802c0e58333e65bb104a72027ad21a32811b8cb7c17e33ce1e4b4be649ea"}
Apr 16 22:05:57.500932 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:57.500639 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:05:57.514192 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:05:57.514048 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sc5nk" podStartSLOduration=65.732558454 podStartE2EDuration="1m8.514034699s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:05:54.205734151 +0000 UTC m=+65.679112899" lastFinishedPulling="2026-04-16 22:05:56.987210396 +0000 UTC m=+68.460589144" observedRunningTime="2026-04-16 22:05:57.513733361 +0000 UTC m=+68.987112132" watchObservedRunningTime="2026-04-16 22:05:57.514034699 +0000 UTC m=+68.987413448"
Apr 16 22:06:25.874007 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:06:25.873968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:06:25.874007 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:06:25.874018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:06:25.874519 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:06:25.874038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:06:25.874519 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:25.874117 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:06:25.874519 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:25.874126 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:06:25.874519 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:25.874148 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f867cf5f7-6snzj: secret "image-registry-tls" not found
Apr 16 22:06:25.874519 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:25.874125 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:06:25.874519 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:25.874172 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls podName:270e029b-17e5-4312-8fcc-59dfc7eecac7 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:29.87415872 +0000 UTC m=+161.347537468 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls") pod "dns-default-r2hfz" (UID: "270e029b-17e5-4312-8fcc-59dfc7eecac7") : secret "dns-default-metrics-tls" not found
Apr 16 22:06:25.874519 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:25.874202 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls podName:394a3f59-3a76-4ffe-a23e-d3e08badf2dc nodeName:}" failed. No retries permitted until 2026-04-16 22:07:29.874187502 +0000 UTC m=+161.347566250 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls") pod "image-registry-f867cf5f7-6snzj" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc") : secret "image-registry-tls" not found
Apr 16 22:06:25.874519 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:25.874217 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert podName:0f18a248-9dfa-4c91-b4cd-46c1f19634f1 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:29.874209264 +0000 UTC m=+161.347588012 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-s42mm" (UID: "0f18a248-9dfa-4c91-b4cd-46c1f19634f1") : secret "networking-console-plugin-cert" not found
Apr 16 22:06:25.975334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:06:25.975302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:06:25.975447 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:25.975435 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:06:25.975503 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:25.975493 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert podName:97234e5c-490f-432d-a702-1a85fbcc4044 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:29.975476817 +0000 UTC m=+161.448855577 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert") pod "ingress-canary-2pb4t" (UID: "97234e5c-490f-432d-a702-1a85fbcc4044") : secret "canary-serving-cert" not found
Apr 16 22:06:28.505648 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:06:28.505618 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sc5nk"
Apr 16 22:06:58.020617 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:06:58.020565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:06:58.021123 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:58.020733 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:06:58.021123 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:06:58.020805 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs podName:6690fd79-9fd1-41a1-acf7-d29fd96d4757 nodeName:}" failed. No retries permitted until 2026-04-16 22:09:00.020785622 +0000 UTC m=+251.494164382 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs") pod "network-metrics-daemon-hzjxc" (UID: "6690fd79-9fd1-41a1-acf7-d29fd96d4757") : secret "metrics-daemon-secret" not found
Apr 16 22:07:07.410794 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:07.410763 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4llp9_e04f9b26-0017-48cc-a5f0-a9c2bae5d9df/dns-node-resolver/0.log"
Apr 16 22:07:08.810741 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:08.810687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tfhck_03c81f44-bba2-4d54-b6db-157f9d7e76c7/node-ca/0.log"
Apr 16 22:07:24.890138 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:07:24.890077 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm" podUID="0f18a248-9dfa-4c91-b4cd-46c1f19634f1"
Apr 16 22:07:24.932958 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:07:24.932925 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" podUID="394a3f59-3a76-4ffe-a23e-d3e08badf2dc"
Apr 16 22:07:24.944086 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:07:24.944049 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-r2hfz" podUID="270e029b-17e5-4312-8fcc-59dfc7eecac7"
Apr 16 22:07:25.004763 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:07:25.004715 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2pb4t" podUID="97234e5c-490f-432d-a702-1a85fbcc4044"
Apr 16 22:07:25.702264 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:25.702223 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:07:25.702264 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:25.702260 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:07:25.702534 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:25.702338 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:07:25.702534 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:25.702497 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:07:26.158376 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:07:26.158269 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-hzjxc" podUID="6690fd79-9fd1-41a1-acf7-d29fd96d4757"
Apr 16 22:07:29.966099 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:29.966058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:07:29.966503 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:29.966107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:07:29.966503 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:29.966224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:07:29.968468 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:29.968434 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"image-registry-f867cf5f7-6snzj\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") " pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:07:29.968576 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:29.968483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/270e029b-17e5-4312-8fcc-59dfc7eecac7-metrics-tls\") pod \"dns-default-r2hfz\" (UID: \"270e029b-17e5-4312-8fcc-59dfc7eecac7\") " pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:07:29.968576 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:29.968491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f18a248-9dfa-4c91-b4cd-46c1f19634f1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-s42mm\" (UID: \"0f18a248-9dfa-4c91-b4cd-46c1f19634f1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:07:30.067139 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.067082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:07:30.069412 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.069381 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97234e5c-490f-432d-a702-1a85fbcc4044-cert\") pod \"ingress-canary-2pb4t\" (UID: \"97234e5c-490f-432d-a702-1a85fbcc4044\") " pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:07:30.206259 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.206227 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-mvhtd\""
Apr 16 22:07:30.206259 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.206243 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vts6k\""
Apr 16 22:07:30.206259 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.206237 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6952s\""
Apr 16 22:07:30.206482 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.206251 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q2drx\""
Apr 16 22:07:30.213329 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.213313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-r2hfz"
Apr 16 22:07:30.213419 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.213352 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:07:30.213478 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.213466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"
Apr 16 22:07:30.213512 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.213482 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2pb4t"
Apr 16 22:07:30.357068 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.357039 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-r2hfz"]
Apr 16 22:07:30.362946 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:07:30.362917 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod270e029b_17e5_4312_8fcc_59dfc7eecac7.slice/crio-d2f0b2b3fa604c8494e7cffa9df9f839d4042a2258eb57624a6b012263dfc036 WatchSource:0}: Error finding container d2f0b2b3fa604c8494e7cffa9df9f839d4042a2258eb57624a6b012263dfc036: Status 404 returned error can't find the container with id d2f0b2b3fa604c8494e7cffa9df9f839d4042a2258eb57624a6b012263dfc036
Apr 16 22:07:30.376032 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.376007 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-s42mm"]
Apr 16 22:07:30.379470 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:07:30.379434 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f18a248_9dfa_4c91_b4cd_46c1f19634f1.slice/crio-6cbe8c78b6953386abb8d517dee91ceef46694dde1d37c9e6e4f6238faf273a6 WatchSource:0}: Error finding container 6cbe8c78b6953386abb8d517dee91ceef46694dde1d37c9e6e4f6238faf273a6: Status 404 returned error can't find the container with id 6cbe8c78b6953386abb8d517dee91ceef46694dde1d37c9e6e4f6238faf273a6
Apr 16 22:07:30.601867 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.601791 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2pb4t"]
Apr 16 22:07:30.605251 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.605222 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f867cf5f7-6snzj"]
Apr 16 22:07:30.605449 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:07:30.605421 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97234e5c_490f_432d_a702_1a85fbcc4044.slice/crio-d92d9ab9cb02dce9dbdae6be8a168eaf920fcc278ef70ca0baac5416bc67eb7b WatchSource:0}: Error finding container d92d9ab9cb02dce9dbdae6be8a168eaf920fcc278ef70ca0baac5416bc67eb7b: Status 404 returned error can't find the container with id d92d9ab9cb02dce9dbdae6be8a168eaf920fcc278ef70ca0baac5416bc67eb7b
Apr 16 22:07:30.607839 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:07:30.607803 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod394a3f59_3a76_4ffe_a23e_d3e08badf2dc.slice/crio-8ba485c12d8c979194123794dbbf28003fb0aa0dabd92a6c7035240d491b865f WatchSource:0}: Error finding container 8ba485c12d8c979194123794dbbf28003fb0aa0dabd92a6c7035240d491b865f: Status 404 returned error can't find the container with id 8ba485c12d8c979194123794dbbf28003fb0aa0dabd92a6c7035240d491b865f
Apr 16 22:07:30.714714 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.714657 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" event={"ID":"394a3f59-3a76-4ffe-a23e-d3e08badf2dc","Type":"ContainerStarted","Data":"2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d"}
Apr 16 22:07:30.714714 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.714711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" event={"ID":"394a3f59-3a76-4ffe-a23e-d3e08badf2dc","Type":"ContainerStarted","Data":"8ba485c12d8c979194123794dbbf28003fb0aa0dabd92a6c7035240d491b865f"}
Apr 16 22:07:30.714961 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.714859 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:07:30.715894 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.715867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r2hfz" event={"ID":"270e029b-17e5-4312-8fcc-59dfc7eecac7","Type":"ContainerStarted","Data":"d2f0b2b3fa604c8494e7cffa9df9f839d4042a2258eb57624a6b012263dfc036"}
Apr 16 22:07:30.716939 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.716916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2pb4t" event={"ID":"97234e5c-490f-432d-a702-1a85fbcc4044","Type":"ContainerStarted","Data":"d92d9ab9cb02dce9dbdae6be8a168eaf920fcc278ef70ca0baac5416bc67eb7b"}
Apr 16 22:07:30.717967 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.717943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm" event={"ID":"0f18a248-9dfa-4c91-b4cd-46c1f19634f1","Type":"ContainerStarted","Data":"6cbe8c78b6953386abb8d517dee91ceef46694dde1d37c9e6e4f6238faf273a6"}
Apr 16 22:07:30.732095 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:30.732042 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" podStartSLOduration=161.732026139 podStartE2EDuration="2m41.732026139s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16
22:07:30.731501914 +0000 UTC m=+162.204880684" watchObservedRunningTime="2026-04-16 22:07:30.732026139 +0000 UTC m=+162.205404910" Apr 16 22:07:32.219497 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.219413 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4" podUID="91d334e4-11b9-4900-b5ea-d30c96867f31" containerName="addon-agent" probeResult="failure" output="Get \"http://10.132.0.8:8000/healthz\": dial tcp 10.132.0.8:8000: connect: connection refused" Apr 16 22:07:32.240373 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.240324 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" podUID="43fc934b-f73b-46de-8615-4974dcadbf4a" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/healthz\": dial tcp 10.132.0.9:8000: connect: connection refused" Apr 16 22:07:32.725312 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.725203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm" event={"ID":"0f18a248-9dfa-4c91-b4cd-46c1f19634f1","Type":"ContainerStarted","Data":"a919f6f6d51d19a3eb82b49ed7b25ecb84731e87816cc8f5b0e85c4ec3f9256f"} Apr 16 22:07:32.726640 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.726607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r2hfz" event={"ID":"270e029b-17e5-4312-8fcc-59dfc7eecac7","Type":"ContainerStarted","Data":"6e676e4354a1413570b155810398a0418ae16902999fd653ede7810fb7a6940e"} Apr 16 22:07:32.726802 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.726650 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r2hfz" event={"ID":"270e029b-17e5-4312-8fcc-59dfc7eecac7","Type":"ContainerStarted","Data":"11af594306748d8ef79904390ca1d7a2e80dfc9fdeb78e736cfc6468ab4945f6"} Apr 16 
22:07:32.726802 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.726740 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-r2hfz" Apr 16 22:07:32.727843 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.727823 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2pb4t" event={"ID":"97234e5c-490f-432d-a702-1a85fbcc4044","Type":"ContainerStarted","Data":"c4c1985374d0cb9b839874ea97fe8a7c34382c0d2826a7d3ade60ccb855c7c0c"} Apr 16 22:07:32.729023 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.729003 2576 generic.go:358] "Generic (PLEG): container finished" podID="91d334e4-11b9-4900-b5ea-d30c96867f31" containerID="791e7b3acb5b158006fdeaba0d03464e08eef67f32cad45f8c92af7686e564a9" exitCode=255 Apr 16 22:07:32.729125 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.729055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4" event={"ID":"91d334e4-11b9-4900-b5ea-d30c96867f31","Type":"ContainerDied","Data":"791e7b3acb5b158006fdeaba0d03464e08eef67f32cad45f8c92af7686e564a9"} Apr 16 22:07:32.729335 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.729322 2576 scope.go:117] "RemoveContainer" containerID="791e7b3acb5b158006fdeaba0d03464e08eef67f32cad45f8c92af7686e564a9" Apr 16 22:07:32.730292 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.730269 2576 generic.go:358] "Generic (PLEG): container finished" podID="43fc934b-f73b-46de-8615-4974dcadbf4a" containerID="e1c4008079c7f32e6ab978b572ee4b7368736cf79617366fc66cddeeed7c5c79" exitCode=1 Apr 16 22:07:32.730377 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.730306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" 
event={"ID":"43fc934b-f73b-46de-8615-4974dcadbf4a","Type":"ContainerDied","Data":"e1c4008079c7f32e6ab978b572ee4b7368736cf79617366fc66cddeeed7c5c79"} Apr 16 22:07:32.730648 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.730558 2576 scope.go:117] "RemoveContainer" containerID="e1c4008079c7f32e6ab978b572ee4b7368736cf79617366fc66cddeeed7c5c79" Apr 16 22:07:32.740928 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.740880 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-s42mm" podStartSLOduration=158.329667786 podStartE2EDuration="2m39.740863141s" podCreationTimestamp="2026-04-16 22:04:53 +0000 UTC" firstStartedPulling="2026-04-16 22:07:30.38138119 +0000 UTC m=+161.854759941" lastFinishedPulling="2026-04-16 22:07:31.792576542 +0000 UTC m=+163.265955296" observedRunningTime="2026-04-16 22:07:32.740093189 +0000 UTC m=+164.213471959" watchObservedRunningTime="2026-04-16 22:07:32.740863141 +0000 UTC m=+164.214241914" Apr 16 22:07:32.769497 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.769455 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2pb4t" podStartSLOduration=129.918678896 podStartE2EDuration="2m11.76943847s" podCreationTimestamp="2026-04-16 22:05:21 +0000 UTC" firstStartedPulling="2026-04-16 22:07:30.607323584 +0000 UTC m=+162.080702332" lastFinishedPulling="2026-04-16 22:07:32.45808315 +0000 UTC m=+163.931461906" observedRunningTime="2026-04-16 22:07:32.768895333 +0000 UTC m=+164.242274102" watchObservedRunningTime="2026-04-16 22:07:32.76943847 +0000 UTC m=+164.242817242" Apr 16 22:07:32.783454 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:32.783410 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-r2hfz" podStartSLOduration=130.311523582 podStartE2EDuration="2m11.783395024s" podCreationTimestamp="2026-04-16 22:05:21 +0000 UTC" 
firstStartedPulling="2026-04-16 22:07:30.365337572 +0000 UTC m=+161.838716327" lastFinishedPulling="2026-04-16 22:07:31.837209001 +0000 UTC m=+163.310587769" observedRunningTime="2026-04-16 22:07:32.782825437 +0000 UTC m=+164.256204207" watchObservedRunningTime="2026-04-16 22:07:32.783395024 +0000 UTC m=+164.256773793" Apr 16 22:07:33.734290 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:33.734252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4d7769d7-g9kc4" event={"ID":"91d334e4-11b9-4900-b5ea-d30c96867f31","Type":"ContainerStarted","Data":"0e8ec9c541a67c84e79d64734068524e261a4274d97f8d7a1cf746448cfb0d2f"} Apr 16 22:07:33.735898 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:33.735871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" event={"ID":"43fc934b-f73b-46de-8615-4974dcadbf4a","Type":"ContainerStarted","Data":"2ae60bd1ed6ed2dd0ceb1bd48aa0647cab3f64e49cbcbeec90f2161f76377817"} Apr 16 22:07:33.736536 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:33.736519 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" Apr 16 22:07:33.737164 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:33.737147 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b58d9997f-6jvpv" Apr 16 22:07:36.312123 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.312089 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4x6xj"] Apr 16 22:07:36.316494 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.316472 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.320661 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.320636 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 22:07:36.321526 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.321511 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 22:07:36.321624 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.321525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wk4hv\"" Apr 16 22:07:36.321624 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.321528 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 22:07:36.325513 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.322135 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 22:07:36.327989 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.327969 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4x6xj"] Apr 16 22:07:36.417876 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.417836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6687b77-55a8-40de-b6f7-e53478b1e21b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.418068 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.417887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e6687b77-55a8-40de-b6f7-e53478b1e21b-data-volume\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.418068 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.417970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stzp2\" (UniqueName: \"kubernetes.io/projected/e6687b77-55a8-40de-b6f7-e53478b1e21b-kube-api-access-stzp2\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.418068 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.418024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e6687b77-55a8-40de-b6f7-e53478b1e21b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.418068 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.418058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e6687b77-55a8-40de-b6f7-e53478b1e21b-crio-socket\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.519054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.519017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stzp2\" (UniqueName: \"kubernetes.io/projected/e6687b77-55a8-40de-b6f7-e53478b1e21b-kube-api-access-stzp2\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " 
pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.519054 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.519054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e6687b77-55a8-40de-b6f7-e53478b1e21b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.519286 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.519079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e6687b77-55a8-40de-b6f7-e53478b1e21b-crio-socket\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.519286 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.519128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6687b77-55a8-40de-b6f7-e53478b1e21b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.519286 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.519164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e6687b77-55a8-40de-b6f7-e53478b1e21b-data-volume\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.519286 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.519239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/e6687b77-55a8-40de-b6f7-e53478b1e21b-crio-socket\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.519570 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.519550 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e6687b77-55a8-40de-b6f7-e53478b1e21b-data-volume\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.519756 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.519738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e6687b77-55a8-40de-b6f7-e53478b1e21b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.521447 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.521425 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e6687b77-55a8-40de-b6f7-e53478b1e21b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.532075 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.532052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stzp2\" (UniqueName: \"kubernetes.io/projected/e6687b77-55a8-40de-b6f7-e53478b1e21b-kube-api-access-stzp2\") pod \"insights-runtime-extractor-4x6xj\" (UID: \"e6687b77-55a8-40de-b6f7-e53478b1e21b\") " pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.628391 ip-10-0-129-68 kubenswrapper[2576]: 
I0416 22:07:36.628301 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4x6xj" Apr 16 22:07:36.740298 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:36.740261 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4x6xj"] Apr 16 22:07:36.743806 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:07:36.743779 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6687b77_55a8_40de_b6f7_e53478b1e21b.slice/crio-35a14bd25400007b11f929688b72a2bb4235600bf1c811760622131f28200d6c WatchSource:0}: Error finding container 35a14bd25400007b11f929688b72a2bb4235600bf1c811760622131f28200d6c: Status 404 returned error can't find the container with id 35a14bd25400007b11f929688b72a2bb4235600bf1c811760622131f28200d6c Apr 16 22:07:37.746532 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:37.746495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4x6xj" event={"ID":"e6687b77-55a8-40de-b6f7-e53478b1e21b","Type":"ContainerStarted","Data":"581cfc75171cf3b505225c5df10373bb6c7deed867e0f2525a26e7c32bc37778"} Apr 16 22:07:37.746532 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:37.746532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4x6xj" event={"ID":"e6687b77-55a8-40de-b6f7-e53478b1e21b","Type":"ContainerStarted","Data":"50417c61c8e82f618372a1d261befb973eb6ba83d5122103baf3b6f7c7bb5623"} Apr 16 22:07:37.746532 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:37.746542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4x6xj" event={"ID":"e6687b77-55a8-40de-b6f7-e53478b1e21b","Type":"ContainerStarted","Data":"35a14bd25400007b11f929688b72a2bb4235600bf1c811760622131f28200d6c"} Apr 16 22:07:38.132936 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:07:38.132847 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc" Apr 16 22:07:39.753081 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:39.753049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4x6xj" event={"ID":"e6687b77-55a8-40de-b6f7-e53478b1e21b","Type":"ContainerStarted","Data":"47ac09ec049c6fbed6745c9a13477114d987289375ee5b517b3403be0461d8db"} Apr 16 22:07:39.769383 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:39.769337 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4x6xj" podStartSLOduration=1.805282807 podStartE2EDuration="3.769323616s" podCreationTimestamp="2026-04-16 22:07:36 +0000 UTC" firstStartedPulling="2026-04-16 22:07:36.805050789 +0000 UTC m=+168.278429542" lastFinishedPulling="2026-04-16 22:07:38.769091602 +0000 UTC m=+170.242470351" observedRunningTime="2026-04-16 22:07:39.767777651 +0000 UTC m=+171.241156422" watchObservedRunningTime="2026-04-16 22:07:39.769323616 +0000 UTC m=+171.242702387" Apr 16 22:07:42.738730 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:42.738667 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-r2hfz" Apr 16 22:07:44.833079 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.833046 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-69m6w"] Apr 16 22:07:44.836800 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.836778 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.839074 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.839052 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:07:44.839175 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.839157 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:07:44.839291 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.839271 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:07:44.839407 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.839306 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:07:44.839769 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.839751 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:07:44.839882 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.839759 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gz4kv\"" Apr 16 22:07:44.839882 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.839858 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:07:44.884703 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.884658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-wtmp\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " 
pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.884875 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.884722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d91095be-a477-4c1e-bd66-36fd94142428-sys\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.884875 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.884758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-textfile\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.884875 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.884793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d91095be-a477-4c1e-bd66-36fd94142428-metrics-client-ca\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.884875 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.884818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.884875 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.884859 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddz48\" (UniqueName: 
\"kubernetes.io/projected/d91095be-a477-4c1e-bd66-36fd94142428-kube-api-access-ddz48\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.885093 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.884897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d91095be-a477-4c1e-bd66-36fd94142428-root\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.885093 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.884923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-accelerators-collector-config\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.885093 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.884950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-tls\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.986116 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddz48\" (UniqueName: \"kubernetes.io/projected/d91095be-a477-4c1e-bd66-36fd94142428-kube-api-access-ddz48\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w" Apr 16 22:07:44.986116 ip-10-0-129-68 kubenswrapper[2576]: I0416 
22:07:44.986121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d91095be-a477-4c1e-bd66-36fd94142428-root\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-accelerators-collector-config\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d91095be-a477-4c1e-bd66-36fd94142428-root\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-tls\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-wtmp\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d91095be-a477-4c1e-bd66-36fd94142428-sys\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986370 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-textfile\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986636 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d91095be-a477-4c1e-bd66-36fd94142428-metrics-client-ca\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986636 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:07:44.986391 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 22:07:44.986636 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986636 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d91095be-a477-4c1e-bd66-36fd94142428-sys\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986636 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:07:44.986482 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-tls podName:d91095be-a477-4c1e-bd66-36fd94142428 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:45.486461936 +0000 UTC m=+176.959840703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-tls") pod "node-exporter-69m6w" (UID: "d91095be-a477-4c1e-bd66-36fd94142428") : secret "node-exporter-tls" not found
Apr 16 22:07:44.986636 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-wtmp\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986857 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-accelerators-collector-config\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986857 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-textfile\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.986943 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.986927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d91095be-a477-4c1e-bd66-36fd94142428-metrics-client-ca\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.988758 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.988738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:44.996194 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:44.996169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddz48\" (UniqueName: \"kubernetes.io/projected/d91095be-a477-4c1e-bd66-36fd94142428-kube-api-access-ddz48\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:45.490840 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:45.490780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-tls\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:45.493138 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:45.493110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d91095be-a477-4c1e-bd66-36fd94142428-node-exporter-tls\") pod \"node-exporter-69m6w\" (UID: \"d91095be-a477-4c1e-bd66-36fd94142428\") " pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:45.746406 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:45.746320 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-69m6w"
Apr 16 22:07:45.754341 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:07:45.754311 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd91095be_a477_4c1e_bd66_36fd94142428.slice/crio-94a7bdba05a14bf46e9efb352d29e089072bb58b29a97ec1f92b12a3da221c51 WatchSource:0}: Error finding container 94a7bdba05a14bf46e9efb352d29e089072bb58b29a97ec1f92b12a3da221c51: Status 404 returned error can't find the container with id 94a7bdba05a14bf46e9efb352d29e089072bb58b29a97ec1f92b12a3da221c51
Apr 16 22:07:45.770369 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:45.770341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-69m6w" event={"ID":"d91095be-a477-4c1e-bd66-36fd94142428","Type":"ContainerStarted","Data":"94a7bdba05a14bf46e9efb352d29e089072bb58b29a97ec1f92b12a3da221c51"}
Apr 16 22:07:46.774089 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:46.773998 2576 generic.go:358] "Generic (PLEG): container finished" podID="d91095be-a477-4c1e-bd66-36fd94142428" containerID="b34364f1f3e0cbbdce9025524d18478f767641cb41e963145676c8f011b16e0e" exitCode=0
Apr 16 22:07:46.774089 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:46.774041 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-69m6w" event={"ID":"d91095be-a477-4c1e-bd66-36fd94142428","Type":"ContainerDied","Data":"b34364f1f3e0cbbdce9025524d18478f767641cb41e963145676c8f011b16e0e"}
Apr 16 22:07:47.777805 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:47.777763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-69m6w" event={"ID":"d91095be-a477-4c1e-bd66-36fd94142428","Type":"ContainerStarted","Data":"4a78223742e2452a47c901c6a7b9317340507212a5b93e0423aa5ae11188c6dc"}
Apr 16 22:07:47.777805 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:47.777811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-69m6w" event={"ID":"d91095be-a477-4c1e-bd66-36fd94142428","Type":"ContainerStarted","Data":"8c71afce9b861c19cf79e607cb5b3cdd6a6735d72ecdaaafc508ce015d421f7c"}
Apr 16 22:07:47.795518 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:47.795466 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-69m6w" podStartSLOduration=3.025789787 podStartE2EDuration="3.79545079s" podCreationTimestamp="2026-04-16 22:07:44 +0000 UTC" firstStartedPulling="2026-04-16 22:07:45.756142083 +0000 UTC m=+177.229520831" lastFinishedPulling="2026-04-16 22:07:46.525803084 +0000 UTC m=+177.999181834" observedRunningTime="2026-04-16 22:07:47.794408401 +0000 UTC m=+179.267787172" watchObservedRunningTime="2026-04-16 22:07:47.79545079 +0000 UTC m=+179.268829559"
Apr 16 22:07:50.217436 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:50.217402 2576 patch_prober.go:28] interesting pod/image-registry-f867cf5f7-6snzj container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:07:50.217880 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:50.217457 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" podUID="394a3f59-3a76-4ffe-a23e-d3e08badf2dc" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:07:51.726061 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:51.726032 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:07:58.846391 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:07:58.846356 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f867cf5f7-6snzj"]
Apr 16 22:08:02.213470 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:02.213427 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" podUID="188339a6-f380-407b-afe4-3c97b17b4206" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 22:08:12.213125 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:12.213082 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" podUID="188339a6-f380-407b-afe4-3c97b17b4206" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 22:08:19.055716 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:19.055665 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-69m6w_d91095be-a477-4c1e-bd66-36fd94142428/init-textfile/0.log"
Apr 16 22:08:19.257033 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:19.257004 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-69m6w_d91095be-a477-4c1e-bd66-36fd94142428/node-exporter/0.log"
Apr 16 22:08:19.455122 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:19.455091 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-69m6w_d91095be-a477-4c1e-bd66-36fd94142428/kube-rbac-proxy/0.log"
Apr 16 22:08:22.212942 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:22.212896 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" podUID="188339a6-f380-407b-afe4-3c97b17b4206" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 22:08:22.213294 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:22.212990 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd"
Apr 16 22:08:22.213595 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:22.213561 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ed7f67d3815414a651f6992084187caa3002570a97f38a3e9d759dc5a7ef51fa"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 22:08:22.213632 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:22.213619 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" podUID="188339a6-f380-407b-afe4-3c97b17b4206" containerName="service-proxy" containerID="cri-o://ed7f67d3815414a651f6992084187caa3002570a97f38a3e9d759dc5a7ef51fa" gracePeriod=30
Apr 16 22:08:22.870922 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:22.870890 2576 generic.go:358] "Generic (PLEG): container finished" podID="188339a6-f380-407b-afe4-3c97b17b4206" containerID="ed7f67d3815414a651f6992084187caa3002570a97f38a3e9d759dc5a7ef51fa" exitCode=2
Apr 16 22:08:22.871084 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:22.870955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" event={"ID":"188339a6-f380-407b-afe4-3c97b17b4206","Type":"ContainerDied","Data":"ed7f67d3815414a651f6992084187caa3002570a97f38a3e9d759dc5a7ef51fa"}
Apr 16 22:08:22.871084 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:22.870990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c6df5dc85-sh7qd" event={"ID":"188339a6-f380-407b-afe4-3c97b17b4206","Type":"ContainerStarted","Data":"f1471cd189512d3c37d5c3508c98e3597e50d7f1e6b779e3e57d5cf60f1d00fe"}
Apr 16 22:08:23.865449 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:23.865385 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" podUID="394a3f59-3a76-4ffe-a23e-d3e08badf2dc" containerName="registry" containerID="cri-o://2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d" gracePeriod=30
Apr 16 22:08:24.097632 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.097604 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:08:24.190905 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.190876 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") pod \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") "
Apr 16 22:08:24.191067 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.190921 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-installation-pull-secrets\") pod \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") "
Apr 16 22:08:24.191067 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.190952 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-image-registry-private-configuration\") pod \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") "
Apr 16 22:08:24.191067 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.190973 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-ca-trust-extracted\") pod \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") "
Apr 16 22:08:24.191244 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.191067 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-bound-sa-token\") pod \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") "
Apr 16 22:08:24.191244 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.191135 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vpkq\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-kube-api-access-2vpkq\") pod \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") "
Apr 16 22:08:24.191244 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.191173 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-trusted-ca\") pod \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") "
Apr 16 22:08:24.191244 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.191200 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-certificates\") pod \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\" (UID: \"394a3f59-3a76-4ffe-a23e-d3e08badf2dc\") "
Apr 16 22:08:24.192547 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.192343 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "394a3f59-3a76-4ffe-a23e-d3e08badf2dc" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:08:24.192547 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.192450 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "394a3f59-3a76-4ffe-a23e-d3e08badf2dc" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:08:24.196013 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.194244 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "394a3f59-3a76-4ffe-a23e-d3e08badf2dc" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:08:24.196013 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.194384 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "394a3f59-3a76-4ffe-a23e-d3e08badf2dc" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:08:24.198657 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.198428 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "394a3f59-3a76-4ffe-a23e-d3e08badf2dc" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:08:24.198657 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.198457 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "394a3f59-3a76-4ffe-a23e-d3e08badf2dc" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:08:24.199274 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.199129 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-kube-api-access-2vpkq" (OuterVolumeSpecName: "kube-api-access-2vpkq") pod "394a3f59-3a76-4ffe-a23e-d3e08badf2dc" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc"). InnerVolumeSpecName "kube-api-access-2vpkq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:08:24.202814 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.202791 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "394a3f59-3a76-4ffe-a23e-d3e08badf2dc" (UID: "394a3f59-3a76-4ffe-a23e-d3e08badf2dc"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:08:24.292939 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.292890 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2vpkq\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-kube-api-access-2vpkq\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:08:24.292939 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.292931 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-trusted-ca\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:08:24.292939 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.292944 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-certificates\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:08:24.292939 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.292953 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-registry-tls\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:08:24.292939 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.292962 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-installation-pull-secrets\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:08:24.293219 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.292972 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-image-registry-private-configuration\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:08:24.293219 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.292982 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-ca-trust-extracted\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:08:24.293219 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.292990 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/394a3f59-3a76-4ffe-a23e-d3e08badf2dc-bound-sa-token\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:08:24.655867 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.655783 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-s42mm_0f18a248-9dfa-4c91-b4cd-46c1f19634f1/networking-console-plugin/0.log"
Apr 16 22:08:24.877790 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.877756 2576 generic.go:358] "Generic (PLEG): container finished" podID="394a3f59-3a76-4ffe-a23e-d3e08badf2dc" containerID="2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d" exitCode=0
Apr 16 22:08:24.878264 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.877828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" event={"ID":"394a3f59-3a76-4ffe-a23e-d3e08badf2dc","Type":"ContainerDied","Data":"2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d"}
Apr 16 22:08:24.878264 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.877858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj" event={"ID":"394a3f59-3a76-4ffe-a23e-d3e08badf2dc","Type":"ContainerDied","Data":"8ba485c12d8c979194123794dbbf28003fb0aa0dabd92a6c7035240d491b865f"}
Apr 16 22:08:24.878264 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.877858 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f867cf5f7-6snzj"
Apr 16 22:08:24.878264 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.877876 2576 scope.go:117] "RemoveContainer" containerID="2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d"
Apr 16 22:08:24.887483 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.887405 2576 scope.go:117] "RemoveContainer" containerID="2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d"
Apr 16 22:08:24.888036 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:08:24.888015 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d\": container with ID starting with 2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d not found: ID does not exist" containerID="2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d"
Apr 16 22:08:24.888102 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.888048 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d"} err="failed to get container status \"2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d\": rpc error: code = NotFound desc = could not find container \"2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d\": container with ID starting with 2c538558048a3aed449e7d23e082a3d3a079ce2a1756edd143864a179f380f7d not found: ID does not exist"
Apr 16 22:08:24.898071 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.898040 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f867cf5f7-6snzj"]
Apr 16 22:08:24.903392 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:24.903374 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-f867cf5f7-6snzj"]
Apr 16 22:08:25.136680 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:25.136644 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="394a3f59-3a76-4ffe-a23e-d3e08badf2dc" path="/var/lib/kubelet/pods/394a3f59-3a76-4ffe-a23e-d3e08badf2dc/volumes"
Apr 16 22:08:26.055462 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:08:26.055432 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2pb4t_97234e5c-490f-432d-a702-1a85fbcc4044/serve-healthcheck-canary/0.log"
Apr 16 22:09:00.051958 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:00.051910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:09:00.054143 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:00.054124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6690fd79-9fd1-41a1-acf7-d29fd96d4757-metrics-certs\") pod \"network-metrics-daemon-hzjxc\" (UID: \"6690fd79-9fd1-41a1-acf7-d29fd96d4757\") " pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:09:00.336596 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:00.336500 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrzhk\""
Apr 16 22:09:00.345185 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:00.345158 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzjxc"
Apr 16 22:09:00.463595 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:00.463551 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hzjxc"]
Apr 16 22:09:00.466301 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:09:00.466271 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6690fd79_9fd1_41a1_acf7_d29fd96d4757.slice/crio-f491f4c0db072b1dc9551f6f8e934c40ef56fe004320f86c982be56a21a230a9 WatchSource:0}: Error finding container f491f4c0db072b1dc9551f6f8e934c40ef56fe004320f86c982be56a21a230a9: Status 404 returned error can't find the container with id f491f4c0db072b1dc9551f6f8e934c40ef56fe004320f86c982be56a21a230a9
Apr 16 22:09:00.972383 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:00.972337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hzjxc" event={"ID":"6690fd79-9fd1-41a1-acf7-d29fd96d4757","Type":"ContainerStarted","Data":"f491f4c0db072b1dc9551f6f8e934c40ef56fe004320f86c982be56a21a230a9"}
Apr 16 22:09:01.976101 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:01.976064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hzjxc" event={"ID":"6690fd79-9fd1-41a1-acf7-d29fd96d4757","Type":"ContainerStarted","Data":"16df5aa148c15c525f86feb3dfc558a9ef2c483694f2b3d2ecfa7e8a1cb5791d"}
Apr 16 22:09:01.976101 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:01.976107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hzjxc" event={"ID":"6690fd79-9fd1-41a1-acf7-d29fd96d4757","Type":"ContainerStarted","Data":"180b06f7c8ffe6d64cf1d1fdaf212f5b719eea4c6d2c45b27dcb8847abcc9b0b"}
Apr 16 22:09:01.990525 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:01.990472 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hzjxc" podStartSLOduration=251.937153963 podStartE2EDuration="4m12.990457647s" podCreationTimestamp="2026-04-16 22:04:49 +0000 UTC" firstStartedPulling="2026-04-16 22:09:00.468027964 +0000 UTC m=+251.941406713" lastFinishedPulling="2026-04-16 22:09:01.521331631 +0000 UTC m=+252.994710397" observedRunningTime="2026-04-16 22:09:01.989625298 +0000 UTC m=+253.463004067" watchObservedRunningTime="2026-04-16 22:09:01.990457647 +0000 UTC m=+253.463836417"
Apr 16 22:09:49.002434 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:49.002401 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log"
Apr 16 22:09:49.003052 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:49.002477 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log"
Apr 16 22:09:49.008898 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:09:49.008881 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 22:11:25.129994 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.129959 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"]
Apr 16 22:11:25.130374 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.130212 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="394a3f59-3a76-4ffe-a23e-d3e08badf2dc" containerName="registry"
Apr 16 22:11:25.130374 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.130223 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="394a3f59-3a76-4ffe-a23e-d3e08badf2dc" containerName="registry"
Apr 16 22:11:25.130374 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.130271 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="394a3f59-3a76-4ffe-a23e-d3e08badf2dc" containerName="registry"
Apr 16 22:11:25.132790 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.132765 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"
Apr 16 22:11:25.135002 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.134969 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 22:11:25.135119 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.135100 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v84j8\""
Apr 16 22:11:25.135752 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.135732 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 22:11:25.139856 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.139830 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"]
Apr 16 22:11:25.251184 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.251154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"
Apr 16 22:11:25.251184 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.251189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"
Apr 16 22:11:25.251431 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.251224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdgfm\" (UniqueName: \"kubernetes.io/projected/9751d54b-c6cc-457e-a723-019fb03649fe-kube-api-access-mdgfm\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"
Apr 16 22:11:25.351767 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.351740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"
Apr 16 22:11:25.351767 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.351769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"
Apr 16 22:11:25.351956 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.351792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdgfm\" (UniqueName: \"kubernetes.io/projected/9751d54b-c6cc-457e-a723-019fb03649fe-kube-api-access-mdgfm\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"
Apr 16 22:11:25.352124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.352107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"
Apr 16 22:11:25.352168 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.352142 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"
Apr 16 22:11:25.363612 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.363578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdgfm\" (UniqueName: \"kubernetes.io/projected/9751d54b-c6cc-457e-a723-019fb03649fe-kube-api-access-mdgfm\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"
Apr 16 22:11:25.442411 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.442358 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk" Apr 16 22:11:25.558763 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.558734 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk"] Apr 16 22:11:25.561728 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:11:25.561682 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9751d54b_c6cc_457e_a723_019fb03649fe.slice/crio-1f37ceb870f5aa43de6e846489d0199964e7c6d992d9a096c2fe29b7b1239ca6 WatchSource:0}: Error finding container 1f37ceb870f5aa43de6e846489d0199964e7c6d992d9a096c2fe29b7b1239ca6: Status 404 returned error can't find the container with id 1f37ceb870f5aa43de6e846489d0199964e7c6d992d9a096c2fe29b7b1239ca6 Apr 16 22:11:25.563469 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:25.563450 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:11:26.346664 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:26.346627 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk" event={"ID":"9751d54b-c6cc-457e-a723-019fb03649fe","Type":"ContainerStarted","Data":"1f37ceb870f5aa43de6e846489d0199964e7c6d992d9a096c2fe29b7b1239ca6"} Apr 16 22:11:30.358067 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:30.358032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk" event={"ID":"9751d54b-c6cc-457e-a723-019fb03649fe","Type":"ContainerStarted","Data":"e92872b84de20faa2f4cd2c2c4c8d330d563ec15217b9957fad8a7c09f088e3d"} Apr 16 22:11:31.361640 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:31.361600 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="9751d54b-c6cc-457e-a723-019fb03649fe" containerID="e92872b84de20faa2f4cd2c2c4c8d330d563ec15217b9957fad8a7c09f088e3d" exitCode=0 Apr 16 22:11:31.362211 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:31.361671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk" event={"ID":"9751d54b-c6cc-457e-a723-019fb03649fe","Type":"ContainerDied","Data":"e92872b84de20faa2f4cd2c2c4c8d330d563ec15217b9957fad8a7c09f088e3d"} Apr 16 22:11:34.372296 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:34.372263 2576 generic.go:358] "Generic (PLEG): container finished" podID="9751d54b-c6cc-457e-a723-019fb03649fe" containerID="31c3bb591d9fd3b61072628a4b6b3b7655338c0aae8db93fbf4637fd4cbce1c7" exitCode=0 Apr 16 22:11:34.372624 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:34.372303 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk" event={"ID":"9751d54b-c6cc-457e-a723-019fb03649fe","Type":"ContainerDied","Data":"31c3bb591d9fd3b61072628a4b6b3b7655338c0aae8db93fbf4637fd4cbce1c7"} Apr 16 22:11:40.394555 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:40.394460 2576 generic.go:358] "Generic (PLEG): container finished" podID="9751d54b-c6cc-457e-a723-019fb03649fe" containerID="d6ef52fadba206ed1c3a8a860e35b741b1991e1766f2d829545e967b53ed1f8c" exitCode=0 Apr 16 22:11:40.394555 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:40.394540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk" event={"ID":"9751d54b-c6cc-457e-a723-019fb03649fe","Type":"ContainerDied","Data":"d6ef52fadba206ed1c3a8a860e35b741b1991e1766f2d829545e967b53ed1f8c"} Apr 16 22:11:41.513823 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:41.513799 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk" Apr 16 22:11:41.681074 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:41.680997 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-bundle\") pod \"9751d54b-c6cc-457e-a723-019fb03649fe\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " Apr 16 22:11:41.681074 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:41.681052 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-util\") pod \"9751d54b-c6cc-457e-a723-019fb03649fe\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " Apr 16 22:11:41.681246 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:41.681111 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdgfm\" (UniqueName: \"kubernetes.io/projected/9751d54b-c6cc-457e-a723-019fb03649fe-kube-api-access-mdgfm\") pod \"9751d54b-c6cc-457e-a723-019fb03649fe\" (UID: \"9751d54b-c6cc-457e-a723-019fb03649fe\") " Apr 16 22:11:41.681428 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:41.681404 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-bundle" (OuterVolumeSpecName: "bundle") pod "9751d54b-c6cc-457e-a723-019fb03649fe" (UID: "9751d54b-c6cc-457e-a723-019fb03649fe"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:11:41.683336 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:41.683313 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9751d54b-c6cc-457e-a723-019fb03649fe-kube-api-access-mdgfm" (OuterVolumeSpecName: "kube-api-access-mdgfm") pod "9751d54b-c6cc-457e-a723-019fb03649fe" (UID: "9751d54b-c6cc-457e-a723-019fb03649fe"). InnerVolumeSpecName "kube-api-access-mdgfm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:11:41.685711 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:41.685673 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-util" (OuterVolumeSpecName: "util") pod "9751d54b-c6cc-457e-a723-019fb03649fe" (UID: "9751d54b-c6cc-457e-a723-019fb03649fe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:11:41.782320 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:41.782287 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-bundle\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:11:41.782320 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:41.782313 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9751d54b-c6cc-457e-a723-019fb03649fe-util\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:11:41.782320 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:41.782322 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mdgfm\" (UniqueName: \"kubernetes.io/projected/9751d54b-c6cc-457e-a723-019fb03649fe-kube-api-access-mdgfm\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:11:42.401127 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:42.401089 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk" event={"ID":"9751d54b-c6cc-457e-a723-019fb03649fe","Type":"ContainerDied","Data":"1f37ceb870f5aa43de6e846489d0199964e7c6d992d9a096c2fe29b7b1239ca6"} Apr 16 22:11:42.401127 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:42.401124 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f37ceb870f5aa43de6e846489d0199964e7c6d992d9a096c2fe29b7b1239ca6" Apr 16 22:11:42.401127 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:42.401131 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqvhfk" Apr 16 22:11:45.093979 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.093945 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-t9lgs"] Apr 16 22:11:45.094340 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.094162 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9751d54b-c6cc-457e-a723-019fb03649fe" containerName="pull" Apr 16 22:11:45.094340 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.094173 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9751d54b-c6cc-457e-a723-019fb03649fe" containerName="pull" Apr 16 22:11:45.094340 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.094191 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9751d54b-c6cc-457e-a723-019fb03649fe" containerName="extract" Apr 16 22:11:45.094340 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.094197 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9751d54b-c6cc-457e-a723-019fb03649fe" containerName="extract" Apr 16 22:11:45.094340 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.094204 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9751d54b-c6cc-457e-a723-019fb03649fe" 
containerName="util" Apr 16 22:11:45.094340 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.094209 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9751d54b-c6cc-457e-a723-019fb03649fe" containerName="util" Apr 16 22:11:45.094340 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.094243 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9751d54b-c6cc-457e-a723-019fb03649fe" containerName="extract" Apr 16 22:11:45.100724 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.100708 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-t9lgs" Apr 16 22:11:45.102816 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.102791 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-6cmrl\"" Apr 16 22:11:45.103056 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.102809 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 22:11:45.103161 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.103054 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-t9lgs"] Apr 16 22:11:45.103396 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.103378 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 22:11:45.205819 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.205776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3-bound-sa-token\") pod \"cert-manager-759f64656b-t9lgs\" (UID: \"5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3\") " pod="cert-manager/cert-manager-759f64656b-t9lgs" Apr 16 22:11:45.206002 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.205833 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9wd5\" (UniqueName: \"kubernetes.io/projected/5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3-kube-api-access-w9wd5\") pod \"cert-manager-759f64656b-t9lgs\" (UID: \"5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3\") " pod="cert-manager/cert-manager-759f64656b-t9lgs" Apr 16 22:11:45.306334 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.306283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3-bound-sa-token\") pod \"cert-manager-759f64656b-t9lgs\" (UID: \"5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3\") " pod="cert-manager/cert-manager-759f64656b-t9lgs" Apr 16 22:11:45.306477 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.306349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9wd5\" (UniqueName: \"kubernetes.io/projected/5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3-kube-api-access-w9wd5\") pod \"cert-manager-759f64656b-t9lgs\" (UID: \"5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3\") " pod="cert-manager/cert-manager-759f64656b-t9lgs" Apr 16 22:11:45.313954 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.313931 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3-bound-sa-token\") pod \"cert-manager-759f64656b-t9lgs\" (UID: \"5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3\") " pod="cert-manager/cert-manager-759f64656b-t9lgs" Apr 16 22:11:45.314144 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.314127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9wd5\" (UniqueName: \"kubernetes.io/projected/5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3-kube-api-access-w9wd5\") pod \"cert-manager-759f64656b-t9lgs\" (UID: \"5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3\") " pod="cert-manager/cert-manager-759f64656b-t9lgs" Apr 16 
22:11:45.409729 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.409631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-t9lgs" Apr 16 22:11:45.525194 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:45.525162 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-t9lgs"] Apr 16 22:11:45.528078 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:11:45.528047 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5af6b4f0_c7b2_4aec_99f9_e6104d8b73b3.slice/crio-13de7219ca259277a7c5a79b7216849f685094a909e31a7a848f3f27a9756ad7 WatchSource:0}: Error finding container 13de7219ca259277a7c5a79b7216849f685094a909e31a7a848f3f27a9756ad7: Status 404 returned error can't find the container with id 13de7219ca259277a7c5a79b7216849f685094a909e31a7a848f3f27a9756ad7 Apr 16 22:11:46.414086 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:46.414028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-t9lgs" event={"ID":"5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3","Type":"ContainerStarted","Data":"13de7219ca259277a7c5a79b7216849f685094a909e31a7a848f3f27a9756ad7"} Apr 16 22:11:49.130980 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.130949 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7"] Apr 16 22:11:49.134203 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.134178 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" Apr 16 22:11:49.136058 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.136038 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 22:11:49.136804 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.136776 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:11:49.136915 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.136861 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-8wbg6\"" Apr 16 22:11:49.139925 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.139875 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7"] Apr 16 22:11:49.238143 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.238109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd24p\" (UniqueName: \"kubernetes.io/projected/14d40e20-7602-4045-95a7-0b73bf25f04e-kube-api-access-cd24p\") pod \"openshift-lws-operator-bfc7f696d-q4fq7\" (UID: \"14d40e20-7602-4045-95a7-0b73bf25f04e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" Apr 16 22:11:49.238302 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.238151 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14d40e20-7602-4045-95a7-0b73bf25f04e-tmp\") pod \"openshift-lws-operator-bfc7f696d-q4fq7\" (UID: \"14d40e20-7602-4045-95a7-0b73bf25f04e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" Apr 16 22:11:49.339235 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.339200 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cd24p\" (UniqueName: \"kubernetes.io/projected/14d40e20-7602-4045-95a7-0b73bf25f04e-kube-api-access-cd24p\") pod \"openshift-lws-operator-bfc7f696d-q4fq7\" (UID: \"14d40e20-7602-4045-95a7-0b73bf25f04e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" Apr 16 22:11:49.339397 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.339242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14d40e20-7602-4045-95a7-0b73bf25f04e-tmp\") pod \"openshift-lws-operator-bfc7f696d-q4fq7\" (UID: \"14d40e20-7602-4045-95a7-0b73bf25f04e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" Apr 16 22:11:49.339584 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.339560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/14d40e20-7602-4045-95a7-0b73bf25f04e-tmp\") pod \"openshift-lws-operator-bfc7f696d-q4fq7\" (UID: \"14d40e20-7602-4045-95a7-0b73bf25f04e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" Apr 16 22:11:49.346559 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.346529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd24p\" (UniqueName: \"kubernetes.io/projected/14d40e20-7602-4045-95a7-0b73bf25f04e-kube-api-access-cd24p\") pod \"openshift-lws-operator-bfc7f696d-q4fq7\" (UID: \"14d40e20-7602-4045-95a7-0b73bf25f04e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" Apr 16 22:11:49.423000 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.422919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-t9lgs" event={"ID":"5af6b4f0-c7b2-4aec-99f9-e6104d8b73b3","Type":"ContainerStarted","Data":"46e4678c22f7dcec28ce995375fe54d76c8d8827cfce43049fc76c84fb453d25"} Apr 16 22:11:49.437061 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:11:49.437015 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-t9lgs" podStartSLOduration=1.573750363 podStartE2EDuration="4.437000548s" podCreationTimestamp="2026-04-16 22:11:45 +0000 UTC" firstStartedPulling="2026-04-16 22:11:45.529835837 +0000 UTC m=+417.003214588" lastFinishedPulling="2026-04-16 22:11:48.393086021 +0000 UTC m=+419.866464773" observedRunningTime="2026-04-16 22:11:49.435759958 +0000 UTC m=+420.909138728" watchObservedRunningTime="2026-04-16 22:11:49.437000548 +0000 UTC m=+420.910379317" Apr 16 22:11:49.443964 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.443936 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" Apr 16 22:11:49.561831 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:49.561809 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7"] Apr 16 22:11:49.564302 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:11:49.564272 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14d40e20_7602_4045_95a7_0b73bf25f04e.slice/crio-755ace0435d45ade3032663a7a6c2cdf1a97d61ace46d9b30922916dbda41a60 WatchSource:0}: Error finding container 755ace0435d45ade3032663a7a6c2cdf1a97d61ace46d9b30922916dbda41a60: Status 404 returned error can't find the container with id 755ace0435d45ade3032663a7a6c2cdf1a97d61ace46d9b30922916dbda41a60 Apr 16 22:11:50.426938 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:50.426897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" event={"ID":"14d40e20-7602-4045-95a7-0b73bf25f04e","Type":"ContainerStarted","Data":"755ace0435d45ade3032663a7a6c2cdf1a97d61ace46d9b30922916dbda41a60"} Apr 16 22:11:52.433921 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:11:52.433881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" event={"ID":"14d40e20-7602-4045-95a7-0b73bf25f04e","Type":"ContainerStarted","Data":"76cbb3e5876cbfa0efc2042c7fbb985114a7e90cd33effbbe131da9d9906d529"} Apr 16 22:11:52.450049 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:11:52.449998 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-q4fq7" podStartSLOduration=1.355288673 podStartE2EDuration="3.449980217s" podCreationTimestamp="2026-04-16 22:11:49 +0000 UTC" firstStartedPulling="2026-04-16 22:11:49.565806976 +0000 UTC m=+421.039185724" lastFinishedPulling="2026-04-16 22:11:51.66049852 +0000 UTC m=+423.133877268" observedRunningTime="2026-04-16 22:11:52.447612841 +0000 UTC m=+423.920991610" watchObservedRunningTime="2026-04-16 22:11:52.449980217 +0000 UTC m=+423.923358989" Apr 16 22:12:12.123761 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.123663 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d"] Apr 16 22:12:12.153239 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.153207 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d"] Apr 16 22:12:12.153427 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.153349 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.156375 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.156344 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 22:12:12.156579 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.156557 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 22:12:12.156764 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.156746 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-52lvq\"" Apr 16 22:12:12.157299 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.157283 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 22:12:12.157416 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.157400 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 22:12:12.203964 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.203933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl5j8\" (UniqueName: \"kubernetes.io/projected/42fa1204-d28a-413e-9cb1-ad8db42994af-kube-api-access-wl5j8\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-h657d\" (UID: \"42fa1204-d28a-413e-9cb1-ad8db42994af\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.204107 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.203977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42fa1204-d28a-413e-9cb1-ad8db42994af-webhook-cert\") pod 
\"opendatahub-operator-controller-manager-674f8cc5cf-h657d\" (UID: \"42fa1204-d28a-413e-9cb1-ad8db42994af\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.204107 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.204062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42fa1204-d28a-413e-9cb1-ad8db42994af-apiservice-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-h657d\" (UID: \"42fa1204-d28a-413e-9cb1-ad8db42994af\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.304902 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.304864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wl5j8\" (UniqueName: \"kubernetes.io/projected/42fa1204-d28a-413e-9cb1-ad8db42994af-kube-api-access-wl5j8\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-h657d\" (UID: \"42fa1204-d28a-413e-9cb1-ad8db42994af\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.305110 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.304911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42fa1204-d28a-413e-9cb1-ad8db42994af-webhook-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-h657d\" (UID: \"42fa1204-d28a-413e-9cb1-ad8db42994af\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.305110 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.304937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42fa1204-d28a-413e-9cb1-ad8db42994af-apiservice-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-h657d\" (UID: \"42fa1204-d28a-413e-9cb1-ad8db42994af\") " 
pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.307429 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.307404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42fa1204-d28a-413e-9cb1-ad8db42994af-webhook-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-h657d\" (UID: \"42fa1204-d28a-413e-9cb1-ad8db42994af\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.307429 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.307423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42fa1204-d28a-413e-9cb1-ad8db42994af-apiservice-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-h657d\" (UID: \"42fa1204-d28a-413e-9cb1-ad8db42994af\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.312182 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.312153 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl5j8\" (UniqueName: \"kubernetes.io/projected/42fa1204-d28a-413e-9cb1-ad8db42994af-kube-api-access-wl5j8\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-h657d\" (UID: \"42fa1204-d28a-413e-9cb1-ad8db42994af\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.463522 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.463485 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:12.588811 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.588674 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d"] Apr 16 22:12:12.591655 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:12:12.591623 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42fa1204_d28a_413e_9cb1_ad8db42994af.slice/crio-cab4e54453002b632d296c6ef2fce4e645d9349b891fa18e9867d0e40ab254d4 WatchSource:0}: Error finding container cab4e54453002b632d296c6ef2fce4e645d9349b891fa18e9867d0e40ab254d4: Status 404 returned error can't find the container with id cab4e54453002b632d296c6ef2fce4e645d9349b891fa18e9867d0e40ab254d4 Apr 16 22:12:12.614944 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.614909 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46"] Apr 16 22:12:12.645711 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.645674 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46"] Apr 16 22:12:12.645858 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.645815 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:12.648101 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.648083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 22:12:12.648213 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.648133 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v84j8\"" Apr 16 22:12:12.648213 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.648203 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 22:12:12.708817 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.708772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:12.708998 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.708827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:12.708998 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.708913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8lc9\" (UniqueName: 
\"kubernetes.io/projected/4bb42bb7-bd6c-45ec-b53b-90a39930e994-kube-api-access-p8lc9\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:12.809485 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.809408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:12.809485 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.809462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:12.809619 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.809512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8lc9\" (UniqueName: \"kubernetes.io/projected/4bb42bb7-bd6c-45ec-b53b-90a39930e994-kube-api-access-p8lc9\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:12.809835 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.809814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-util\") pod 
\"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:12.809901 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.809879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:12.817308 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.817278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8lc9\" (UniqueName: \"kubernetes.io/projected/4bb42bb7-bd6c-45ec-b53b-90a39930e994-kube-api-access-p8lc9\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:12.954782 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:12.954739 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:13.075832 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:13.075625 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46"] Apr 16 22:12:13.078435 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:12:13.078407 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bb42bb7_bd6c_45ec_b53b_90a39930e994.slice/crio-4de5ecff7d26ca0766b72c6286ffa3439d063e743ed2a591d45b4c1023750049 WatchSource:0}: Error finding container 4de5ecff7d26ca0766b72c6286ffa3439d063e743ed2a591d45b4c1023750049: Status 404 returned error can't find the container with id 4de5ecff7d26ca0766b72c6286ffa3439d063e743ed2a591d45b4c1023750049 Apr 16 22:12:13.496088 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:13.496045 2576 generic.go:358] "Generic (PLEG): container finished" podID="4bb42bb7-bd6c-45ec-b53b-90a39930e994" containerID="2b4d8d50a48a38017f26dbe13ba38d375e777ba59da8e648bbadac5dcb6b60b2" exitCode=0 Apr 16 22:12:13.496537 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:13.496244 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" event={"ID":"4bb42bb7-bd6c-45ec-b53b-90a39930e994","Type":"ContainerDied","Data":"2b4d8d50a48a38017f26dbe13ba38d375e777ba59da8e648bbadac5dcb6b60b2"} Apr 16 22:12:13.496537 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:13.496277 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" event={"ID":"4bb42bb7-bd6c-45ec-b53b-90a39930e994","Type":"ContainerStarted","Data":"4de5ecff7d26ca0766b72c6286ffa3439d063e743ed2a591d45b4c1023750049"} Apr 16 22:12:13.499306 ip-10-0-129-68 kubenswrapper[2576]: I0416 
22:12:13.499278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" event={"ID":"42fa1204-d28a-413e-9cb1-ad8db42994af","Type":"ContainerStarted","Data":"cab4e54453002b632d296c6ef2fce4e645d9349b891fa18e9867d0e40ab254d4"} Apr 16 22:12:15.507549 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:15.507515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" event={"ID":"42fa1204-d28a-413e-9cb1-ad8db42994af","Type":"ContainerStarted","Data":"b5158db9ad14dd8959f82b16073cb231f06193cd29c613d3ba7cddf93636b988"} Apr 16 22:12:15.508044 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:15.507642 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:15.509096 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:15.509074 2576 generic.go:358] "Generic (PLEG): container finished" podID="4bb42bb7-bd6c-45ec-b53b-90a39930e994" containerID="6cbc67f8fc3a1a8cf3ef76d78de61517e5635347764ef48f4221acf424830af9" exitCode=0 Apr 16 22:12:15.509195 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:15.509116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" event={"ID":"4bb42bb7-bd6c-45ec-b53b-90a39930e994","Type":"ContainerDied","Data":"6cbc67f8fc3a1a8cf3ef76d78de61517e5635347764ef48f4221acf424830af9"} Apr 16 22:12:15.531165 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:15.531115 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" podStartSLOduration=1.110858451 podStartE2EDuration="3.531098187s" podCreationTimestamp="2026-04-16 22:12:12 +0000 UTC" firstStartedPulling="2026-04-16 22:12:12.594092885 +0000 UTC m=+444.067471633" 
lastFinishedPulling="2026-04-16 22:12:15.014332621 +0000 UTC m=+446.487711369" observedRunningTime="2026-04-16 22:12:15.529107355 +0000 UTC m=+447.002486124" watchObservedRunningTime="2026-04-16 22:12:15.531098187 +0000 UTC m=+447.004476957" Apr 16 22:12:16.513863 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:16.513825 2576 generic.go:358] "Generic (PLEG): container finished" podID="4bb42bb7-bd6c-45ec-b53b-90a39930e994" containerID="e876fed61ceaec87b1636438d9956cce20aeddeb08d5a62ba8b19a3749ef77ab" exitCode=0 Apr 16 22:12:16.514208 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:16.513864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" event={"ID":"4bb42bb7-bd6c-45ec-b53b-90a39930e994","Type":"ContainerDied","Data":"e876fed61ceaec87b1636438d9956cce20aeddeb08d5a62ba8b19a3749ef77ab"} Apr 16 22:12:17.632157 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:17.632135 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:17.647322 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:17.647297 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-util\") pod \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " Apr 16 22:12:17.647445 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:17.647355 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8lc9\" (UniqueName: \"kubernetes.io/projected/4bb42bb7-bd6c-45ec-b53b-90a39930e994-kube-api-access-p8lc9\") pod \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " Apr 16 22:12:17.647511 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:17.647451 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-bundle\") pod \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\" (UID: \"4bb42bb7-bd6c-45ec-b53b-90a39930e994\") " Apr 16 22:12:17.648676 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:17.648652 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-bundle" (OuterVolumeSpecName: "bundle") pod "4bb42bb7-bd6c-45ec-b53b-90a39930e994" (UID: "4bb42bb7-bd6c-45ec-b53b-90a39930e994"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:12:17.650766 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:17.650736 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb42bb7-bd6c-45ec-b53b-90a39930e994-kube-api-access-p8lc9" (OuterVolumeSpecName: "kube-api-access-p8lc9") pod "4bb42bb7-bd6c-45ec-b53b-90a39930e994" (UID: "4bb42bb7-bd6c-45ec-b53b-90a39930e994"). InnerVolumeSpecName "kube-api-access-p8lc9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:12:17.653169 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:17.653146 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-util" (OuterVolumeSpecName: "util") pod "4bb42bb7-bd6c-45ec-b53b-90a39930e994" (UID: "4bb42bb7-bd6c-45ec-b53b-90a39930e994"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:12:17.748670 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:17.748636 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8lc9\" (UniqueName: \"kubernetes.io/projected/4bb42bb7-bd6c-45ec-b53b-90a39930e994-kube-api-access-p8lc9\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:12:17.748670 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:17.748666 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-bundle\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:12:17.748670 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:17.748677 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bb42bb7-bd6c-45ec-b53b-90a39930e994-util\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:12:18.522280 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:18.522239 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" event={"ID":"4bb42bb7-bd6c-45ec-b53b-90a39930e994","Type":"ContainerDied","Data":"4de5ecff7d26ca0766b72c6286ffa3439d063e743ed2a591d45b4c1023750049"} Apr 16 22:12:18.522280 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:18.522270 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9mzq46" Apr 16 22:12:18.522280 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:18.522283 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4de5ecff7d26ca0766b72c6286ffa3439d063e743ed2a591d45b4c1023750049" Apr 16 22:12:26.516368 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:26.516329 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-h657d" Apr 16 22:12:28.596243 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.596203 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq"] Apr 16 22:12:28.596614 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.596444 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bb42bb7-bd6c-45ec-b53b-90a39930e994" containerName="pull" Apr 16 22:12:28.596614 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.596455 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb42bb7-bd6c-45ec-b53b-90a39930e994" containerName="pull" Apr 16 22:12:28.596614 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.596462 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bb42bb7-bd6c-45ec-b53b-90a39930e994" containerName="extract" Apr 16 22:12:28.596614 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.596468 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4bb42bb7-bd6c-45ec-b53b-90a39930e994" containerName="extract" Apr 16 22:12:28.596614 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.596479 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bb42bb7-bd6c-45ec-b53b-90a39930e994" containerName="util" Apr 16 22:12:28.596614 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.596485 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb42bb7-bd6c-45ec-b53b-90a39930e994" containerName="util" Apr 16 22:12:28.596614 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.596523 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bb42bb7-bd6c-45ec-b53b-90a39930e994" containerName="extract" Apr 16 22:12:28.604738 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.604717 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:28.606892 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.606861 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 22:12:28.607133 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.607110 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq"] Apr 16 22:12:28.607591 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.607559 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 22:12:28.607746 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.607592 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v84j8\"" Apr 16 22:12:28.619149 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.619123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-qvsz6\" (UniqueName: \"kubernetes.io/projected/4396e660-5933-449c-86ae-ded98fede16d-kube-api-access-qvsz6\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:28.619361 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.619301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:28.619465 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.619384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:28.720647 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.720608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:28.720856 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.720673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvsz6\" 
(UniqueName: \"kubernetes.io/projected/4396e660-5933-449c-86ae-ded98fede16d-kube-api-access-qvsz6\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:28.720856 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.720742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:28.721081 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.721058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:28.721127 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.721102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:28.728058 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.728034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvsz6\" (UniqueName: \"kubernetes.io/projected/4396e660-5933-449c-86ae-ded98fede16d-kube-api-access-qvsz6\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:28.914453 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:28.914373 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:29.031930 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.031906 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq"] Apr 16 22:12:29.034441 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:12:29.034412 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4396e660_5933_449c_86ae_ded98fede16d.slice/crio-df7bed72fe9fe8ecd32475a22331832c229ba11c072fa4fb288878e86874e96e WatchSource:0}: Error finding container df7bed72fe9fe8ecd32475a22331832c229ba11c072fa4fb288878e86874e96e: Status 404 returned error can't find the container with id df7bed72fe9fe8ecd32475a22331832c229ba11c072fa4fb288878e86874e96e Apr 16 22:12:29.161223 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.161185 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd"] Apr 16 22:12:29.164373 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.164351 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.166519 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.166456 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 22:12:29.166637 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.166601 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 22:12:29.166726 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.166607 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-lrjtd\"" Apr 16 22:12:29.166726 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.166653 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 22:12:29.166726 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.166656 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 22:12:29.178249 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.175705 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd"] Apr 16 22:12:29.224393 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.224367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ed6f5be-2f34-44d2-b714-759092f582f3-tmp\") pod \"kube-auth-proxy-6c9f6bcb5c-x6fqd\" (UID: \"6ed6f5be-2f34-44d2-b714-759092f582f3\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.224548 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.224409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9psm\" (UniqueName: 
\"kubernetes.io/projected/6ed6f5be-2f34-44d2-b714-759092f582f3-kube-api-access-t9psm\") pod \"kube-auth-proxy-6c9f6bcb5c-x6fqd\" (UID: \"6ed6f5be-2f34-44d2-b714-759092f582f3\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.224548 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.224434 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed6f5be-2f34-44d2-b714-759092f582f3-tls-certs\") pod \"kube-auth-proxy-6c9f6bcb5c-x6fqd\" (UID: \"6ed6f5be-2f34-44d2-b714-759092f582f3\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.325526 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.325489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed6f5be-2f34-44d2-b714-759092f582f3-tls-certs\") pod \"kube-auth-proxy-6c9f6bcb5c-x6fqd\" (UID: \"6ed6f5be-2f34-44d2-b714-759092f582f3\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.325709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.325547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ed6f5be-2f34-44d2-b714-759092f582f3-tmp\") pod \"kube-auth-proxy-6c9f6bcb5c-x6fqd\" (UID: \"6ed6f5be-2f34-44d2-b714-759092f582f3\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.325709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.325569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9psm\" (UniqueName: \"kubernetes.io/projected/6ed6f5be-2f34-44d2-b714-759092f582f3-kube-api-access-t9psm\") pod \"kube-auth-proxy-6c9f6bcb5c-x6fqd\" (UID: \"6ed6f5be-2f34-44d2-b714-759092f582f3\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.327941 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.327915 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ed6f5be-2f34-44d2-b714-759092f582f3-tmp\") pod \"kube-auth-proxy-6c9f6bcb5c-x6fqd\" (UID: \"6ed6f5be-2f34-44d2-b714-759092f582f3\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.328100 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.328084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed6f5be-2f34-44d2-b714-759092f582f3-tls-certs\") pod \"kube-auth-proxy-6c9f6bcb5c-x6fqd\" (UID: \"6ed6f5be-2f34-44d2-b714-759092f582f3\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.332674 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.332650 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9psm\" (UniqueName: \"kubernetes.io/projected/6ed6f5be-2f34-44d2-b714-759092f582f3-kube-api-access-t9psm\") pod \"kube-auth-proxy-6c9f6bcb5c-x6fqd\" (UID: \"6ed6f5be-2f34-44d2-b714-759092f582f3\") " pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.533118 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.533067 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" Apr 16 22:12:29.562104 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.562072 2576 generic.go:358] "Generic (PLEG): container finished" podID="4396e660-5933-449c-86ae-ded98fede16d" containerID="756f7e5871c78c4dc3f927e4915feeaa02c2eb19d9b29c26bc67e29c7dab0d3d" exitCode=0 Apr 16 22:12:29.562253 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.562125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" event={"ID":"4396e660-5933-449c-86ae-ded98fede16d","Type":"ContainerDied","Data":"756f7e5871c78c4dc3f927e4915feeaa02c2eb19d9b29c26bc67e29c7dab0d3d"} Apr 16 22:12:29.562253 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.562148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" event={"ID":"4396e660-5933-449c-86ae-ded98fede16d","Type":"ContainerStarted","Data":"df7bed72fe9fe8ecd32475a22331832c229ba11c072fa4fb288878e86874e96e"} Apr 16 22:12:29.650505 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:29.650478 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd"] Apr 16 22:12:29.653145 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:12:29.653122 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ed6f5be_2f34_44d2_b714_759092f582f3.slice/crio-dab398be1e109712882207ee4cc5e972dc751f4fbef75b747645946719c40e53 WatchSource:0}: Error finding container dab398be1e109712882207ee4cc5e972dc751f4fbef75b747645946719c40e53: Status 404 returned error can't find the container with id dab398be1e109712882207ee4cc5e972dc751f4fbef75b747645946719c40e53 Apr 16 22:12:30.568722 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:30.568600 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="4396e660-5933-449c-86ae-ded98fede16d" containerID="11c49469ad7951ce5ee6233198c44fdcf79d551ef8da790937195f9e79b66ce7" exitCode=0 Apr 16 22:12:30.568722 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:30.568645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" event={"ID":"4396e660-5933-449c-86ae-ded98fede16d","Type":"ContainerDied","Data":"11c49469ad7951ce5ee6233198c44fdcf79d551ef8da790937195f9e79b66ce7"} Apr 16 22:12:30.570545 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:30.570511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" event={"ID":"6ed6f5be-2f34-44d2-b714-759092f582f3","Type":"ContainerStarted","Data":"dab398be1e109712882207ee4cc5e972dc751f4fbef75b747645946719c40e53"} Apr 16 22:12:31.576683 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:31.576653 2576 generic.go:358] "Generic (PLEG): container finished" podID="4396e660-5933-449c-86ae-ded98fede16d" containerID="a85dccc9ea67e7e715ffa6411033ea494a6a361298faacbaa7a666fa3947119b" exitCode=0 Apr 16 22:12:31.577100 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:31.576739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" event={"ID":"4396e660-5933-449c-86ae-ded98fede16d","Type":"ContainerDied","Data":"a85dccc9ea67e7e715ffa6411033ea494a6a361298faacbaa7a666fa3947119b"} Apr 16 22:12:32.374869 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.374846 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-4zzf4"] Apr 16 22:12:32.379122 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.379104 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" Apr 16 22:12:32.381040 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.380997 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 22:12:32.381323 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.381308 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-5dfcg\"" Apr 16 22:12:32.387350 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.387328 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-4zzf4"] Apr 16 22:12:32.452545 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.452516 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-cert\") pod \"odh-model-controller-858dbf95b8-4zzf4\" (UID: \"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a\") " pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" Apr 16 22:12:32.452688 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.452571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jbt6\" (UniqueName: \"kubernetes.io/projected/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-kube-api-access-2jbt6\") pod \"odh-model-controller-858dbf95b8-4zzf4\" (UID: \"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a\") " pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" Apr 16 22:12:32.553509 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.553478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-cert\") pod \"odh-model-controller-858dbf95b8-4zzf4\" (UID: \"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a\") " pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" Apr 16 
22:12:32.553682 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.553531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jbt6\" (UniqueName: \"kubernetes.io/projected/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-kube-api-access-2jbt6\") pod \"odh-model-controller-858dbf95b8-4zzf4\" (UID: \"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a\") " pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" Apr 16 22:12:32.553682 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:32.553625 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 22:12:32.553795 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:32.553712 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-cert podName:1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a nodeName:}" failed. No retries permitted until 2026-04-16 22:12:33.053670485 +0000 UTC m=+464.527049235 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-cert") pod "odh-model-controller-858dbf95b8-4zzf4" (UID: "1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a") : secret "odh-model-controller-webhook-cert" not found Apr 16 22:12:32.561330 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.561307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jbt6\" (UniqueName: \"kubernetes.io/projected/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-kube-api-access-2jbt6\") pod \"odh-model-controller-858dbf95b8-4zzf4\" (UID: \"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a\") " pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" Apr 16 22:12:32.580752 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.580668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" event={"ID":"6ed6f5be-2f34-44d2-b714-759092f582f3","Type":"ContainerStarted","Data":"8b4ec4e58f0d2158dc454887c17bfd477b1a1ec523df21fce437488b2604860f"} Apr 16 22:12:32.594850 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.594798 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6c9f6bcb5c-x6fqd" podStartSLOduration=0.921646605 podStartE2EDuration="3.594784008s" podCreationTimestamp="2026-04-16 22:12:29 +0000 UTC" firstStartedPulling="2026-04-16 22:12:29.654845156 +0000 UTC m=+461.128223917" lastFinishedPulling="2026-04-16 22:12:32.327982573 +0000 UTC m=+463.801361320" observedRunningTime="2026-04-16 22:12:32.594555891 +0000 UTC m=+464.067934663" watchObservedRunningTime="2026-04-16 22:12:32.594784008 +0000 UTC m=+464.068162777" Apr 16 22:12:32.711978 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.711954 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:32.755192 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.755165 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvsz6\" (UniqueName: \"kubernetes.io/projected/4396e660-5933-449c-86ae-ded98fede16d-kube-api-access-qvsz6\") pod \"4396e660-5933-449c-86ae-ded98fede16d\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " Apr 16 22:12:32.755351 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.755239 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-bundle\") pod \"4396e660-5933-449c-86ae-ded98fede16d\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " Apr 16 22:12:32.755433 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.755414 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-util\") pod \"4396e660-5933-449c-86ae-ded98fede16d\" (UID: \"4396e660-5933-449c-86ae-ded98fede16d\") " Apr 16 22:12:32.756083 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.756062 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-bundle" (OuterVolumeSpecName: "bundle") pod "4396e660-5933-449c-86ae-ded98fede16d" (UID: "4396e660-5933-449c-86ae-ded98fede16d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:12:32.757221 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.757200 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4396e660-5933-449c-86ae-ded98fede16d-kube-api-access-qvsz6" (OuterVolumeSpecName: "kube-api-access-qvsz6") pod "4396e660-5933-449c-86ae-ded98fede16d" (UID: "4396e660-5933-449c-86ae-ded98fede16d"). InnerVolumeSpecName "kube-api-access-qvsz6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:12:32.761431 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.761402 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-util" (OuterVolumeSpecName: "util") pod "4396e660-5933-449c-86ae-ded98fede16d" (UID: "4396e660-5933-449c-86ae-ded98fede16d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:12:32.856780 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.856684 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qvsz6\" (UniqueName: \"kubernetes.io/projected/4396e660-5933-449c-86ae-ded98fede16d-kube-api-access-qvsz6\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:12:32.856780 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.856729 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-bundle\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:12:32.856780 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:32.856738 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4396e660-5933-449c-86ae-ded98fede16d-util\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:12:33.058958 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:33.058923 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-cert\") pod \"odh-model-controller-858dbf95b8-4zzf4\" (UID: \"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a\") " pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" Apr 16 22:12:33.059121 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:33.059040 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 22:12:33.059121 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:33.059095 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-cert podName:1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a nodeName:}" failed. No retries permitted until 2026-04-16 22:12:34.059079126 +0000 UTC m=+465.532457874 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-cert") pod "odh-model-controller-858dbf95b8-4zzf4" (UID: "1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a") : secret "odh-model-controller-webhook-cert" not found Apr 16 22:12:33.585428 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:33.585397 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" Apr 16 22:12:33.585782 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:33.585396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48354k7zq" event={"ID":"4396e660-5933-449c-86ae-ded98fede16d","Type":"ContainerDied","Data":"df7bed72fe9fe8ecd32475a22331832c229ba11c072fa4fb288878e86874e96e"} Apr 16 22:12:33.585782 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:33.585504 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df7bed72fe9fe8ecd32475a22331832c229ba11c072fa4fb288878e86874e96e" Apr 16 22:12:34.068220 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:34.068176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-cert\") pod \"odh-model-controller-858dbf95b8-4zzf4\" (UID: \"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a\") " pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" Apr 16 22:12:34.070688 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:34.070665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a-cert\") pod \"odh-model-controller-858dbf95b8-4zzf4\" (UID: \"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a\") " pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" Apr 16 22:12:34.189892 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:34.189846 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" Apr 16 22:12:34.311969 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:34.311938 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-4zzf4"] Apr 16 22:12:34.314824 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:12:34.314791 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f4e3ca8_23e2_4f21_95c4_8e25b4582c9a.slice/crio-bc75675e8bc893eac0fc27c408291d927b50fc649177f86155bb92b21ded0cfb WatchSource:0}: Error finding container bc75675e8bc893eac0fc27c408291d927b50fc649177f86155bb92b21ded0cfb: Status 404 returned error can't find the container with id bc75675e8bc893eac0fc27c408291d927b50fc649177f86155bb92b21ded0cfb Apr 16 22:12:34.589680 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:34.589643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" event={"ID":"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a","Type":"ContainerStarted","Data":"bc75675e8bc893eac0fc27c408291d927b50fc649177f86155bb92b21ded0cfb"} Apr 16 22:12:37.602873 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:37.602838 2576 generic.go:358] "Generic (PLEG): container finished" podID="1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a" containerID="741e93f8d65b46e14c5bd4f001ff816924666e02178d7be1d6f2a3d5c885fd7d" exitCode=1 Apr 16 22:12:37.603228 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:37.602919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" event={"ID":"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a","Type":"ContainerDied","Data":"741e93f8d65b46e14c5bd4f001ff816924666e02178d7be1d6f2a3d5c885fd7d"} Apr 16 22:12:37.603228 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:37.603107 2576 scope.go:117] "RemoveContainer" containerID="741e93f8d65b46e14c5bd4f001ff816924666e02178d7be1d6f2a3d5c885fd7d" Apr 16 
22:12:38.314499 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.314467 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-l8fml"] Apr 16 22:12:38.314773 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.314761 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4396e660-5933-449c-86ae-ded98fede16d" containerName="extract" Apr 16 22:12:38.314823 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.314775 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4396e660-5933-449c-86ae-ded98fede16d" containerName="extract" Apr 16 22:12:38.314823 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.314791 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4396e660-5933-449c-86ae-ded98fede16d" containerName="util" Apr 16 22:12:38.314823 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.314797 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4396e660-5933-449c-86ae-ded98fede16d" containerName="util" Apr 16 22:12:38.314823 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.314803 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4396e660-5933-449c-86ae-ded98fede16d" containerName="pull" Apr 16 22:12:38.314823 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.314809 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4396e660-5933-449c-86ae-ded98fede16d" containerName="pull" Apr 16 22:12:38.314967 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.314852 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4396e660-5933-449c-86ae-ded98fede16d" containerName="extract" Apr 16 22:12:38.317534 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.317518 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" Apr 16 22:12:38.319811 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.319786 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 22:12:38.319921 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.319814 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-nmfpn\"" Apr 16 22:12:38.330062 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.330039 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-l8fml"] Apr 16 22:12:38.403020 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.402990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2qnc\" (UniqueName: \"kubernetes.io/projected/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-kube-api-access-w2qnc\") pod \"kserve-controller-manager-856948b99f-l8fml\" (UID: \"1d3b2559-e6ac-4fd3-ab02-a0e626613f92\") " pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" Apr 16 22:12:38.403183 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.403042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-cert\") pod \"kserve-controller-manager-856948b99f-l8fml\" (UID: \"1d3b2559-e6ac-4fd3-ab02-a0e626613f92\") " pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" Apr 16 22:12:38.504475 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.504443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2qnc\" (UniqueName: \"kubernetes.io/projected/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-kube-api-access-w2qnc\") pod \"kserve-controller-manager-856948b99f-l8fml\" (UID: 
\"1d3b2559-e6ac-4fd3-ab02-a0e626613f92\") " pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" Apr 16 22:12:38.504642 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.504519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-cert\") pod \"kserve-controller-manager-856948b99f-l8fml\" (UID: \"1d3b2559-e6ac-4fd3-ab02-a0e626613f92\") " pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" Apr 16 22:12:38.504730 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:38.504644 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 22:12:38.504791 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:38.504741 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-cert podName:1d3b2559-e6ac-4fd3-ab02-a0e626613f92 nodeName:}" failed. No retries permitted until 2026-04-16 22:12:39.004718042 +0000 UTC m=+470.478096797 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-cert") pod "kserve-controller-manager-856948b99f-l8fml" (UID: "1d3b2559-e6ac-4fd3-ab02-a0e626613f92") : secret "kserve-webhook-server-cert" not found Apr 16 22:12:38.514048 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.514018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2qnc\" (UniqueName: \"kubernetes.io/projected/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-kube-api-access-w2qnc\") pod \"kserve-controller-manager-856948b99f-l8fml\" (UID: \"1d3b2559-e6ac-4fd3-ab02-a0e626613f92\") " pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" Apr 16 22:12:38.608129 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.608036 2576 generic.go:358] "Generic (PLEG): container finished" podID="1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a" containerID="90cae542c18125ba7f81f54439c2e386a1071b0e7e1da09f29a2b064cefbd3a2" exitCode=1 Apr 16 22:12:38.608129 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.608093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" event={"ID":"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a","Type":"ContainerDied","Data":"90cae542c18125ba7f81f54439c2e386a1071b0e7e1da09f29a2b064cefbd3a2"} Apr 16 22:12:38.608129 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.608124 2576 scope.go:117] "RemoveContainer" containerID="741e93f8d65b46e14c5bd4f001ff816924666e02178d7be1d6f2a3d5c885fd7d" Apr 16 22:12:38.608651 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:38.608367 2576 scope.go:117] "RemoveContainer" containerID="90cae542c18125ba7f81f54439c2e386a1071b0e7e1da09f29a2b064cefbd3a2" Apr 16 22:12:38.608651 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:38.608572 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=odh-model-controller-858dbf95b8-4zzf4_opendatahub(1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a)\"" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" podUID="1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a" Apr 16 22:12:39.008069 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:39.008033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-cert\") pod \"kserve-controller-manager-856948b99f-l8fml\" (UID: \"1d3b2559-e6ac-4fd3-ab02-a0e626613f92\") " pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" Apr 16 22:12:39.008208 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:39.008184 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 22:12:39.008266 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:39.008254 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-cert podName:1d3b2559-e6ac-4fd3-ab02-a0e626613f92 nodeName:}" failed. No retries permitted until 2026-04-16 22:12:40.008236758 +0000 UTC m=+471.481615506 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-cert") pod "kserve-controller-manager-856948b99f-l8fml" (UID: "1d3b2559-e6ac-4fd3-ab02-a0e626613f92") : secret "kserve-webhook-server-cert" not found Apr 16 22:12:39.613285 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:39.613251 2576 scope.go:117] "RemoveContainer" containerID="90cae542c18125ba7f81f54439c2e386a1071b0e7e1da09f29a2b064cefbd3a2" Apr 16 22:12:39.613731 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:39.613471 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-4zzf4_opendatahub(1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a)\"" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" podUID="1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a" Apr 16 22:12:40.014708 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:40.014668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-cert\") pod \"kserve-controller-manager-856948b99f-l8fml\" (UID: \"1d3b2559-e6ac-4fd3-ab02-a0e626613f92\") " pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" Apr 16 22:12:40.017023 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:40.017003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d3b2559-e6ac-4fd3-ab02-a0e626613f92-cert\") pod \"kserve-controller-manager-856948b99f-l8fml\" (UID: \"1d3b2559-e6ac-4fd3-ab02-a0e626613f92\") " pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" Apr 16 22:12:40.127983 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:40.127949 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" Apr 16 22:12:40.248196 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:40.248128 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-l8fml"] Apr 16 22:12:40.250277 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:12:40.250244 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d3b2559_e6ac_4fd3_ab02_a0e626613f92.slice/crio-f951ed24efa69bd6ddf7a800effd6aa37d9175e798be7b0f47916b1da0640a12 WatchSource:0}: Error finding container f951ed24efa69bd6ddf7a800effd6aa37d9175e798be7b0f47916b1da0640a12: Status 404 returned error can't find the container with id f951ed24efa69bd6ddf7a800effd6aa37d9175e798be7b0f47916b1da0640a12 Apr 16 22:12:40.617597 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:40.617559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" event={"ID":"1d3b2559-e6ac-4fd3-ab02-a0e626613f92","Type":"ContainerStarted","Data":"f951ed24efa69bd6ddf7a800effd6aa37d9175e798be7b0f47916b1da0640a12"} Apr 16 22:12:42.179640 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.179591 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2"] Apr 16 22:12:42.184105 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.184085 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" Apr 16 22:12:42.187178 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.187152 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 22:12:42.188078 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.188044 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 22:12:42.188078 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.188043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v84j8\"" Apr 16 22:12:42.193328 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.193306 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2"] Apr 16 22:12:42.233398 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.233370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" Apr 16 22:12:42.233585 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.233426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" Apr 16 22:12:42.233585 
ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.233469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k77c5\" (UniqueName: \"kubernetes.io/projected/50d97476-877e-4dfa-8661-1a70e55ca009-kube-api-access-k77c5\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" Apr 16 22:12:42.334611 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.334569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" Apr 16 22:12:42.334782 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.334630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k77c5\" (UniqueName: \"kubernetes.io/projected/50d97476-877e-4dfa-8661-1a70e55ca009-kube-api-access-k77c5\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" Apr 16 22:12:42.334782 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.334717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" Apr 16 22:12:42.335021 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:12:42.334998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2"
Apr 16 22:12:42.335109 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.335063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2"
Apr 16 22:12:42.343902 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.343856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k77c5\" (UniqueName: \"kubernetes.io/projected/50d97476-877e-4dfa-8661-1a70e55ca009-kube-api-access-k77c5\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2"
Apr 16 22:12:42.496105 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.496077 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2"
Apr 16 22:12:42.625781 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.625738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" event={"ID":"1d3b2559-e6ac-4fd3-ab02-a0e626613f92","Type":"ContainerStarted","Data":"88a2b878ac515789369f6516b5eabdbf2283ee81709412db3b063d243a82c1ea"}
Apr 16 22:12:42.626008 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.625990 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-l8fml"
Apr 16 22:12:42.626665 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.626648 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2"]
Apr 16 22:12:42.628850 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:12:42.628824 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50d97476_877e_4dfa_8661_1a70e55ca009.slice/crio-444d0494a16cb210b24e1d4f885cbad67b882f76cee134e199a66fcc1356bff0 WatchSource:0}: Error finding container 444d0494a16cb210b24e1d4f885cbad67b882f76cee134e199a66fcc1356bff0: Status 404 returned error can't find the container with id 444d0494a16cb210b24e1d4f885cbad67b882f76cee134e199a66fcc1356bff0
Apr 16 22:12:42.649826 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:42.649786 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-l8fml" podStartSLOduration=2.476429347 podStartE2EDuration="4.649769185s" podCreationTimestamp="2026-04-16 22:12:38 +0000 UTC" firstStartedPulling="2026-04-16 22:12:40.251726864 +0000 UTC m=+471.725105613" lastFinishedPulling="2026-04-16 22:12:42.425066699 +0000 UTC m=+473.898445451" observedRunningTime="2026-04-16 22:12:42.647777373 +0000 UTC m=+474.121156142" watchObservedRunningTime="2026-04-16 22:12:42.649769185 +0000 UTC m=+474.123147955"
Apr 16 22:12:43.536381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.536345 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"]
Apr 16 22:12:43.539499 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.539482 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"
Apr 16 22:12:43.542307 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.542282 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 16 22:12:43.543479 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.543243 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 16 22:12:43.543479 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.543361 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-d7brv\""
Apr 16 22:12:43.556649 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.556623 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"]
Apr 16 22:12:43.630449 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.630404 2576 generic.go:358] "Generic (PLEG): container finished" podID="50d97476-877e-4dfa-8661-1a70e55ca009" containerID="6908c2f0cb41d2450da504b09c75a25f1d17900b588b4e1a522cb37da218b083" exitCode=0
Apr 16 22:12:43.630614 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.630500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" event={"ID":"50d97476-877e-4dfa-8661-1a70e55ca009","Type":"ContainerDied","Data":"6908c2f0cb41d2450da504b09c75a25f1d17900b588b4e1a522cb37da218b083"}
Apr 16 22:12:43.630614 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.630538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" event={"ID":"50d97476-877e-4dfa-8661-1a70e55ca009","Type":"ContainerStarted","Data":"444d0494a16cb210b24e1d4f885cbad67b882f76cee134e199a66fcc1356bff0"}
Apr 16 22:12:43.642367 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.642344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a1978b63-d8b7-4d7e-b9dd-19260e7b4b14-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vmzq8\" (UID: \"a1978b63-d8b7-4d7e-b9dd-19260e7b4b14\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"
Apr 16 22:12:43.642469 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.642383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l79qm\" (UniqueName: \"kubernetes.io/projected/a1978b63-d8b7-4d7e-b9dd-19260e7b4b14-kube-api-access-l79qm\") pod \"servicemesh-operator3-55f49c5f94-vmzq8\" (UID: \"a1978b63-d8b7-4d7e-b9dd-19260e7b4b14\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"
Apr 16 22:12:43.743458 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.743423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a1978b63-d8b7-4d7e-b9dd-19260e7b4b14-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vmzq8\" (UID: \"a1978b63-d8b7-4d7e-b9dd-19260e7b4b14\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"
Apr 16 22:12:43.743458 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.743471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l79qm\" (UniqueName: \"kubernetes.io/projected/a1978b63-d8b7-4d7e-b9dd-19260e7b4b14-kube-api-access-l79qm\") pod \"servicemesh-operator3-55f49c5f94-vmzq8\" (UID: \"a1978b63-d8b7-4d7e-b9dd-19260e7b4b14\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"
Apr 16 22:12:43.746340 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.746306 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a1978b63-d8b7-4d7e-b9dd-19260e7b4b14-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vmzq8\" (UID: \"a1978b63-d8b7-4d7e-b9dd-19260e7b4b14\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"
Apr 16 22:12:43.758090 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.758063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l79qm\" (UniqueName: \"kubernetes.io/projected/a1978b63-d8b7-4d7e-b9dd-19260e7b4b14-kube-api-access-l79qm\") pod \"servicemesh-operator3-55f49c5f94-vmzq8\" (UID: \"a1978b63-d8b7-4d7e-b9dd-19260e7b4b14\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"
Apr 16 22:12:43.853026 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.852907 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"
Apr 16 22:12:43.972462 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:43.972437 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"]
Apr 16 22:12:43.975251 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:12:43.975224 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1978b63_d8b7_4d7e_b9dd_19260e7b4b14.slice/crio-7aa674df2faeb3ccb7ce98e8d984ca38d28c792a48d2d9f5163fd9451c8f020a WatchSource:0}: Error finding container 7aa674df2faeb3ccb7ce98e8d984ca38d28c792a48d2d9f5163fd9451c8f020a: Status 404 returned error can't find the container with id 7aa674df2faeb3ccb7ce98e8d984ca38d28c792a48d2d9f5163fd9451c8f020a
Apr 16 22:12:44.190438 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:44.190411 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4"
Apr 16 22:12:44.190833 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:44.190821 2576 scope.go:117] "RemoveContainer" containerID="90cae542c18125ba7f81f54439c2e386a1071b0e7e1da09f29a2b064cefbd3a2"
Apr 16 22:12:44.190996 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:12:44.190980 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-4zzf4_opendatahub(1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a)\"" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" podUID="1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a"
Apr 16 22:12:44.637564 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:44.637484 2576 generic.go:358] "Generic (PLEG): container finished" podID="50d97476-877e-4dfa-8661-1a70e55ca009" containerID="6c5cd00816c880355bbe65b38c34d08645f06057a437e5364cc27520ccaf6478" exitCode=0
Apr 16 22:12:44.638032 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:44.637573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" event={"ID":"50d97476-877e-4dfa-8661-1a70e55ca009","Type":"ContainerDied","Data":"6c5cd00816c880355bbe65b38c34d08645f06057a437e5364cc27520ccaf6478"}
Apr 16 22:12:44.639303 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:44.639268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8" event={"ID":"a1978b63-d8b7-4d7e-b9dd-19260e7b4b14","Type":"ContainerStarted","Data":"7aa674df2faeb3ccb7ce98e8d984ca38d28c792a48d2d9f5163fd9451c8f020a"}
Apr 16 22:12:45.644842 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:45.644810 2576 generic.go:358] "Generic (PLEG): container finished" podID="50d97476-877e-4dfa-8661-1a70e55ca009" containerID="641f943a42986dfee603979e47bbded6952cfd185e3544b763cae85981047ea6" exitCode=0
Apr 16 22:12:45.645255 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:45.644895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" event={"ID":"50d97476-877e-4dfa-8661-1a70e55ca009","Type":"ContainerDied","Data":"641f943a42986dfee603979e47bbded6952cfd185e3544b763cae85981047ea6"}
Apr 16 22:12:46.650639 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.650599 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8" event={"ID":"a1978b63-d8b7-4d7e-b9dd-19260e7b4b14","Type":"ContainerStarted","Data":"c38df2955d138acc8cdc3f342daae63c5404d85adda99da4c7eec0bc9986b8b3"}
Apr 16 22:12:46.651093 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.650803 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"
Apr 16 22:12:46.672161 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.671901 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8" podStartSLOduration=1.195584918 podStartE2EDuration="3.671883436s" podCreationTimestamp="2026-04-16 22:12:43 +0000 UTC" firstStartedPulling="2026-04-16 22:12:43.977798309 +0000 UTC m=+475.451177056" lastFinishedPulling="2026-04-16 22:12:46.454096818 +0000 UTC m=+477.927475574" observedRunningTime="2026-04-16 22:12:46.669196396 +0000 UTC m=+478.142575167" watchObservedRunningTime="2026-04-16 22:12:46.671883436 +0000 UTC m=+478.145262208"
Apr 16 22:12:46.810965 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.810943 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2"
Apr 16 22:12:46.869387 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.869354 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-util\") pod \"50d97476-877e-4dfa-8661-1a70e55ca009\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") "
Apr 16 22:12:46.869530 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.869395 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k77c5\" (UniqueName: \"kubernetes.io/projected/50d97476-877e-4dfa-8661-1a70e55ca009-kube-api-access-k77c5\") pod \"50d97476-877e-4dfa-8661-1a70e55ca009\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") "
Apr 16 22:12:46.869530 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.869426 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-bundle\") pod \"50d97476-877e-4dfa-8661-1a70e55ca009\" (UID: \"50d97476-877e-4dfa-8661-1a70e55ca009\") "
Apr 16 22:12:46.870329 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.870303 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-bundle" (OuterVolumeSpecName: "bundle") pod "50d97476-877e-4dfa-8661-1a70e55ca009" (UID: "50d97476-877e-4dfa-8661-1a70e55ca009"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:12:46.871460 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.871433 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d97476-877e-4dfa-8661-1a70e55ca009-kube-api-access-k77c5" (OuterVolumeSpecName: "kube-api-access-k77c5") pod "50d97476-877e-4dfa-8661-1a70e55ca009" (UID: "50d97476-877e-4dfa-8661-1a70e55ca009"). InnerVolumeSpecName "kube-api-access-k77c5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:12:46.874814 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.874790 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-util" (OuterVolumeSpecName: "util") pod "50d97476-877e-4dfa-8661-1a70e55ca009" (UID: "50d97476-877e-4dfa-8661-1a70e55ca009"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:12:46.970292 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.970209 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-util\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:12:46.970292 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.970238 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k77c5\" (UniqueName: \"kubernetes.io/projected/50d97476-877e-4dfa-8661-1a70e55ca009-kube-api-access-k77c5\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:12:46.970292 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:46.970247 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50d97476-877e-4dfa-8661-1a70e55ca009-bundle\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\""
Apr 16 22:12:47.655777 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:47.655753 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2"
Apr 16 22:12:47.655777 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:47.655753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c25jfb2" event={"ID":"50d97476-877e-4dfa-8661-1a70e55ca009","Type":"ContainerDied","Data":"444d0494a16cb210b24e1d4f885cbad67b882f76cee134e199a66fcc1356bff0"}
Apr 16 22:12:47.656211 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:47.655793 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="444d0494a16cb210b24e1d4f885cbad67b882f76cee134e199a66fcc1356bff0"
Apr 16 22:12:54.190529 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:54.190492 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4"
Apr 16 22:12:54.190914 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:54.190871 2576 scope.go:117] "RemoveContainer" containerID="90cae542c18125ba7f81f54439c2e386a1071b0e7e1da09f29a2b064cefbd3a2"
Apr 16 22:12:54.679881 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:54.679847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" event={"ID":"1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a","Type":"ContainerStarted","Data":"bd955b7c345d5fe72a4e07879c175d3953cbf1533dc9272e19827b751143fad0"}
Apr 16 22:12:54.680061 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:54.680045 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4"
Apr 16 22:12:54.698432 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:54.698387 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4" podStartSLOduration=2.508168074 podStartE2EDuration="22.698374638s" podCreationTimestamp="2026-04-16 22:12:32 +0000 UTC" firstStartedPulling="2026-04-16 22:12:34.316177608 +0000 UTC m=+465.789556357" lastFinishedPulling="2026-04-16 22:12:54.50638416 +0000 UTC m=+485.979762921" observedRunningTime="2026-04-16 22:12:54.696201033 +0000 UTC m=+486.169579804" watchObservedRunningTime="2026-04-16 22:12:54.698374638 +0000 UTC m=+486.171753408"
Apr 16 22:12:57.659332 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:57.659301 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vmzq8"
Apr 16 22:12:58.826604 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.826568 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"]
Apr 16 22:12:58.826991 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.826879 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50d97476-877e-4dfa-8661-1a70e55ca009" containerName="pull"
Apr 16 22:12:58.826991 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.826890 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d97476-877e-4dfa-8661-1a70e55ca009" containerName="pull"
Apr 16 22:12:58.826991 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.826906 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50d97476-877e-4dfa-8661-1a70e55ca009" containerName="extract"
Apr 16 22:12:58.826991 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.826912 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d97476-877e-4dfa-8661-1a70e55ca009" containerName="extract"
Apr 16 22:12:58.826991 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.826923 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50d97476-877e-4dfa-8661-1a70e55ca009" containerName="util"
Apr 16 22:12:58.826991 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.826929 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d97476-877e-4dfa-8661-1a70e55ca009" containerName="util"
Apr 16 22:12:58.826991 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.826972 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="50d97476-877e-4dfa-8661-1a70e55ca009" containerName="extract"
Apr 16 22:12:58.830097 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.830077 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:58.833283 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.833261 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 16 22:12:58.833431 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.833411 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 16 22:12:58.834887 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.834867 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 22:12:58.834887 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.834877 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-8d5lm\""
Apr 16 22:12:58.835715 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.835683 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 16 22:12:58.856431 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.856398 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"]
Apr 16 22:12:58.963436 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.963399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/545b828a-0345-4b9d-a2d0-2f95cbc996d7-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:58.963436 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.963436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l955k\" (UniqueName: \"kubernetes.io/projected/545b828a-0345-4b9d-a2d0-2f95cbc996d7-kube-api-access-l955k\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:58.963642 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.963459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:58.963642 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.963475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:58.963642 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.963551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:58.963642 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.963585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:58.963642 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:58.963640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/545b828a-0345-4b9d-a2d0-2f95cbc996d7-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.064813 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.064772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/545b828a-0345-4b9d-a2d0-2f95cbc996d7-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.065009 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.064821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l955k\" (UniqueName: \"kubernetes.io/projected/545b828a-0345-4b9d-a2d0-2f95cbc996d7-kube-api-access-l955k\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.065009 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.064849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.065009 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.064871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.065009 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.064896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.065009 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.064917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.065009 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.064969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/545b828a-0345-4b9d-a2d0-2f95cbc996d7-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.066460 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.066423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.068607 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.068577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/545b828a-0345-4b9d-a2d0-2f95cbc996d7-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.068957 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.068927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/545b828a-0345-4b9d-a2d0-2f95cbc996d7-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.069146 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.069119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.069897 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.069872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.084547 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.084488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/545b828a-0345-4b9d-a2d0-2f95cbc996d7-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.084842 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.084820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l955k\" (UniqueName: \"kubernetes.io/projected/545b828a-0345-4b9d-a2d0-2f95cbc996d7-kube-api-access-l955k\") pod \"istiod-openshift-gateway-55ff986f96-f6qtk\" (UID: \"545b828a-0345-4b9d-a2d0-2f95cbc996d7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.139366 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.139334 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:12:59.283110 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.283081 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"]
Apr 16 22:12:59.285969 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:12:59.285943 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545b828a_0345_4b9d_a2d0_2f95cbc996d7.slice/crio-acc164d9e5ba8d3bb45bd4447fbad69f8394d1636a863e9eb8510ccae5bba624 WatchSource:0}: Error finding container acc164d9e5ba8d3bb45bd4447fbad69f8394d1636a863e9eb8510ccae5bba624: Status 404 returned error can't find the container with id acc164d9e5ba8d3bb45bd4447fbad69f8394d1636a863e9eb8510ccae5bba624
Apr 16 22:12:59.698008 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:12:59.697969 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk" event={"ID":"545b828a-0345-4b9d-a2d0-2f95cbc996d7","Type":"ContainerStarted","Data":"acc164d9e5ba8d3bb45bd4447fbad69f8394d1636a863e9eb8510ccae5bba624"}
Apr 16 22:13:02.189117 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:13:02.189076 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 16 22:13:02.189363 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:13:02.189176 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 16 22:13:02.709900 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:13:02.709867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk" event={"ID":"545b828a-0345-4b9d-a2d0-2f95cbc996d7","Type":"ContainerStarted","Data":"76901a76caec7843d84bb65feca6652c8bfcd3de4ba1dc0654a3fd70676f9cd5"}
Apr 16 22:13:02.710099 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:13:02.709970 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:13:02.729762 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:13:02.729673 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk" podStartSLOduration=1.828752781 podStartE2EDuration="4.729651366s" podCreationTimestamp="2026-04-16 22:12:58 +0000 UTC" firstStartedPulling="2026-04-16 22:12:59.287931818 +0000 UTC m=+490.761310581" lastFinishedPulling="2026-04-16 22:13:02.188830416 +0000 UTC m=+493.662209166" observedRunningTime="2026-04-16 22:13:02.727403876 +0000 UTC m=+494.200782648" watchObservedRunningTime="2026-04-16 22:13:02.729651366 +0000 UTC m=+494.203030137"
Apr 16 22:13:03.714756 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:13:03.714726 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-f6qtk"
Apr 16 22:13:05.685635 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:13:05.685605 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-4zzf4"
Apr 16 22:13:13.635474 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:13:13.635445 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-l8fml"
Apr 16 22:14:05.198593 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.198513 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z"]
Apr 16 22:14:05.200568 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.200552 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z"
Apr 16 22:14:05.202685 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.202661 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 22:14:05.202975 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.202748 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 22:14:05.203357 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.203339 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-kbskg\""
Apr 16 22:14:05.210811 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.210790 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z"]
Apr 16 22:14:05.284755 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.284718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhnq\" (UniqueName: \"kubernetes.io/projected/351b7139-ad12-4a3b-a370-271268b6e9aa-kube-api-access-dbhnq\") pod \"limitador-operator-controller-manager-85c4996f8c-5l62z\" (UID: \"351b7139-ad12-4a3b-a370-271268b6e9aa\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z"
Apr 16 22:14:05.385642 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.385608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhnq\" (UniqueName: \"kubernetes.io/projected/351b7139-ad12-4a3b-a370-271268b6e9aa-kube-api-access-dbhnq\") pod \"limitador-operator-controller-manager-85c4996f8c-5l62z\" (UID: \"351b7139-ad12-4a3b-a370-271268b6e9aa\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z"
Apr 16 22:14:05.395788
ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.395760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhnq\" (UniqueName: \"kubernetes.io/projected/351b7139-ad12-4a3b-a370-271268b6e9aa-kube-api-access-dbhnq\") pod \"limitador-operator-controller-manager-85c4996f8c-5l62z\" (UID: \"351b7139-ad12-4a3b-a370-271268b6e9aa\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" Apr 16 22:14:05.511323 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.511213 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" Apr 16 22:14:05.645509 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.645483 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z"] Apr 16 22:14:05.648520 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:14:05.648493 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod351b7139_ad12_4a3b_a370_271268b6e9aa.slice/crio-90f68b2d7acf6eb6d0453fe9d56298606d72729a8f31127e1e8a402a07c47a0d WatchSource:0}: Error finding container 90f68b2d7acf6eb6d0453fe9d56298606d72729a8f31127e1e8a402a07c47a0d: Status 404 returned error can't find the container with id 90f68b2d7acf6eb6d0453fe9d56298606d72729a8f31127e1e8a402a07c47a0d Apr 16 22:14:05.915423 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:05.915335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" event={"ID":"351b7139-ad12-4a3b-a370-271268b6e9aa","Type":"ContainerStarted","Data":"90f68b2d7acf6eb6d0453fe9d56298606d72729a8f31127e1e8a402a07c47a0d"} Apr 16 22:14:07.929009 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:07.928918 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" event={"ID":"351b7139-ad12-4a3b-a370-271268b6e9aa","Type":"ContainerStarted","Data":"3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a"} Apr 16 22:14:07.929376 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:07.929173 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" Apr 16 22:14:07.944899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:07.944853 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" podStartSLOduration=1.121950819 podStartE2EDuration="2.944840763s" podCreationTimestamp="2026-04-16 22:14:05 +0000 UTC" firstStartedPulling="2026-04-16 22:14:05.650877322 +0000 UTC m=+557.124256084" lastFinishedPulling="2026-04-16 22:14:07.47376728 +0000 UTC m=+558.947146028" observedRunningTime="2026-04-16 22:14:07.94333335 +0000 UTC m=+559.416712120" watchObservedRunningTime="2026-04-16 22:14:07.944840763 +0000 UTC m=+559.418219533" Apr 16 22:14:18.934446 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:18.934412 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" Apr 16 22:14:19.982242 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:19.982206 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z"] Apr 16 22:14:19.982729 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:19.982446 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" containerName="manager" containerID="cri-o://3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a" gracePeriod=2 Apr 16 
22:14:19.984291 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:19.984233 2576 status_manager.go:895] "Failed to get status for pod" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" err="pods \"limitador-operator-controller-manager-85c4996f8c-5l62z\" is forbidden: User \"system:node:ip-10-0-129-68.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-68.ec2.internal' and this object" Apr 16 22:14:19.986428 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:19.986400 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z"] Apr 16 22:14:20.012537 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.012508 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465"] Apr 16 22:14:20.012860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.012843 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" containerName="manager" Apr 16 22:14:20.012860 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.012859 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" containerName="manager" Apr 16 22:14:20.013005 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.012916 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" containerName="manager" Apr 16 22:14:20.014601 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.014584 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465" Apr 16 22:14:20.017736 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.017684 2576 status_manager.go:895] "Failed to get status for pod" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" err="pods \"limitador-operator-controller-manager-85c4996f8c-5l62z\" is forbidden: User \"system:node:ip-10-0-129-68.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-68.ec2.internal' and this object" Apr 16 22:14:20.034641 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.034614 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465"] Apr 16 22:14:20.110559 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.110530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9lz2\" (UniqueName: \"kubernetes.io/projected/49a88115-b1b6-4e3a-b806-2810565bea72-kube-api-access-j9lz2\") pod \"limitador-operator-controller-manager-85c4996f8c-p2465\" (UID: \"49a88115-b1b6-4e3a-b806-2810565bea72\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465" Apr 16 22:14:20.209705 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.209672 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" Apr 16 22:14:20.210931 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.210909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9lz2\" (UniqueName: \"kubernetes.io/projected/49a88115-b1b6-4e3a-b806-2810565bea72-kube-api-access-j9lz2\") pod \"limitador-operator-controller-manager-85c4996f8c-p2465\" (UID: \"49a88115-b1b6-4e3a-b806-2810565bea72\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465" Apr 16 22:14:20.211920 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.211894 2576 status_manager.go:895] "Failed to get status for pod" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" err="pods \"limitador-operator-controller-manager-85c4996f8c-5l62z\" is forbidden: User \"system:node:ip-10-0-129-68.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-68.ec2.internal' and this object" Apr 16 22:14:20.219435 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.219408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9lz2\" (UniqueName: \"kubernetes.io/projected/49a88115-b1b6-4e3a-b806-2810565bea72-kube-api-access-j9lz2\") pod \"limitador-operator-controller-manager-85c4996f8c-p2465\" (UID: \"49a88115-b1b6-4e3a-b806-2810565bea72\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465" Apr 16 22:14:20.311446 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.311359 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbhnq\" (UniqueName: \"kubernetes.io/projected/351b7139-ad12-4a3b-a370-271268b6e9aa-kube-api-access-dbhnq\") pod \"351b7139-ad12-4a3b-a370-271268b6e9aa\" (UID: \"351b7139-ad12-4a3b-a370-271268b6e9aa\") " 
Apr 16 22:14:20.313512 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.313484 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/351b7139-ad12-4a3b-a370-271268b6e9aa-kube-api-access-dbhnq" (OuterVolumeSpecName: "kube-api-access-dbhnq") pod "351b7139-ad12-4a3b-a370-271268b6e9aa" (UID: "351b7139-ad12-4a3b-a370-271268b6e9aa"). InnerVolumeSpecName "kube-api-access-dbhnq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:14:20.354600 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.354565 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465" Apr 16 22:14:20.412773 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.412741 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dbhnq\" (UniqueName: \"kubernetes.io/projected/351b7139-ad12-4a3b-a370-271268b6e9aa-kube-api-access-dbhnq\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:14:20.478795 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.478771 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465"] Apr 16 22:14:20.480935 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:14:20.480908 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a88115_b1b6_4e3a_b806_2810565bea72.slice/crio-7a20fa3b55e9228eb80daf497948aba08d18a15b73d1f8366a1f979054cd1449 WatchSource:0}: Error finding container 7a20fa3b55e9228eb80daf497948aba08d18a15b73d1f8366a1f979054cd1449: Status 404 returned error can't find the container with id 7a20fa3b55e9228eb80daf497948aba08d18a15b73d1f8366a1f979054cd1449 Apr 16 22:14:20.978113 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.978079 2576 generic.go:358] "Generic (PLEG): container finished" podID="351b7139-ad12-4a3b-a370-271268b6e9aa" 
containerID="3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a" exitCode=0 Apr 16 22:14:20.978307 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.978127 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" Apr 16 22:14:20.978307 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.978186 2576 scope.go:117] "RemoveContainer" containerID="3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a" Apr 16 22:14:20.979630 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.979606 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465" event={"ID":"49a88115-b1b6-4e3a-b806-2810565bea72","Type":"ContainerStarted","Data":"c316e018411e00e92b3d6cde64827d05fc6cf78d74f457b04501201835a5ea35"} Apr 16 22:14:20.979729 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.979637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465" event={"ID":"49a88115-b1b6-4e3a-b806-2810565bea72","Type":"ContainerStarted","Data":"7a20fa3b55e9228eb80daf497948aba08d18a15b73d1f8366a1f979054cd1449"} Apr 16 22:14:20.979820 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.979732 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465" Apr 16 22:14:20.980598 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.980579 2576 status_manager.go:895] "Failed to get status for pod" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" err="pods \"limitador-operator-controller-manager-85c4996f8c-5l62z\" is forbidden: User \"system:node:ip-10-0-129-68.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found 
between node 'ip-10-0-129-68.ec2.internal' and this object" Apr 16 22:14:20.982319 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.982291 2576 status_manager.go:895] "Failed to get status for pod" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" err="pods \"limitador-operator-controller-manager-85c4996f8c-5l62z\" is forbidden: User \"system:node:ip-10-0-129-68.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-68.ec2.internal' and this object" Apr 16 22:14:20.986756 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.986584 2576 scope.go:117] "RemoveContainer" containerID="3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a" Apr 16 22:14:20.986872 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:14:20.986853 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a\": container with ID starting with 3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a not found: ID does not exist" containerID="3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a" Apr 16 22:14:20.986910 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:20.986883 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a"} err="failed to get container status \"3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a\": rpc error: code = NotFound desc = could not find container \"3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a\": container with ID starting with 3e5080fb7d75b52e293d11c3b40f13603735735419e146c295b005b035cf249a not found: ID does not exist" Apr 16 22:14:21.001806 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:21.001761 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465" podStartSLOduration=2.001747628 podStartE2EDuration="2.001747628s" podCreationTimestamp="2026-04-16 22:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:14:21.000383671 +0000 UTC m=+572.473762461" watchObservedRunningTime="2026-04-16 22:14:21.001747628 +0000 UTC m=+572.475126399" Apr 16 22:14:21.002175 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:21.002151 2576 status_manager.go:895] "Failed to get status for pod" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5l62z" err="pods \"limitador-operator-controller-manager-85c4996f8c-5l62z\" is forbidden: User \"system:node:ip-10-0-129-68.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-68.ec2.internal' and this object" Apr 16 22:14:21.137306 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:21.137262 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="351b7139-ad12-4a3b-a370-271268b6e9aa" path="/var/lib/kubelet/pods/351b7139-ad12-4a3b-a370-271268b6e9aa/volumes" Apr 16 22:14:31.987200 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:31.987170 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p2465" Apr 16 22:14:49.030226 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:49.030193 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:14:49.030935 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:14:49.030916 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:15:06.891047 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:06.891009 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:15:06.894129 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:06.894114 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-ts778" Apr 16 22:15:06.896241 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:06.896221 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-xfwsw\"" Apr 16 22:15:06.896364 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:06.896221 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 22:15:06.901450 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:06.901352 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:15:06.930704 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:06.930664 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:15:06.979671 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:06.979628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccdx6\" (UniqueName: \"kubernetes.io/projected/399be673-c9fa-4c73-9706-3c82ffd26a3b-kube-api-access-ccdx6\") pod \"limitador-limitador-78c99df468-ts778\" (UID: \"399be673-c9fa-4c73-9706-3c82ffd26a3b\") " pod="kuadrant-system/limitador-limitador-78c99df468-ts778" Apr 16 22:15:06.979851 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:06.979739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-file\" (UniqueName: \"kubernetes.io/configmap/399be673-c9fa-4c73-9706-3c82ffd26a3b-config-file\") pod \"limitador-limitador-78c99df468-ts778\" (UID: \"399be673-c9fa-4c73-9706-3c82ffd26a3b\") " pod="kuadrant-system/limitador-limitador-78c99df468-ts778" Apr 16 22:15:07.080630 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:07.080593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/399be673-c9fa-4c73-9706-3c82ffd26a3b-config-file\") pod \"limitador-limitador-78c99df468-ts778\" (UID: \"399be673-c9fa-4c73-9706-3c82ffd26a3b\") " pod="kuadrant-system/limitador-limitador-78c99df468-ts778" Apr 16 22:15:07.080815 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:07.080666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccdx6\" (UniqueName: \"kubernetes.io/projected/399be673-c9fa-4c73-9706-3c82ffd26a3b-kube-api-access-ccdx6\") pod \"limitador-limitador-78c99df468-ts778\" (UID: \"399be673-c9fa-4c73-9706-3c82ffd26a3b\") " pod="kuadrant-system/limitador-limitador-78c99df468-ts778" Apr 16 22:15:07.081239 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:07.081220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/399be673-c9fa-4c73-9706-3c82ffd26a3b-config-file\") pod \"limitador-limitador-78c99df468-ts778\" (UID: \"399be673-c9fa-4c73-9706-3c82ffd26a3b\") " pod="kuadrant-system/limitador-limitador-78c99df468-ts778" Apr 16 22:15:07.088556 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:07.088522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccdx6\" (UniqueName: \"kubernetes.io/projected/399be673-c9fa-4c73-9706-3c82ffd26a3b-kube-api-access-ccdx6\") pod \"limitador-limitador-78c99df468-ts778\" (UID: \"399be673-c9fa-4c73-9706-3c82ffd26a3b\") " pod="kuadrant-system/limitador-limitador-78c99df468-ts778" Apr 16 22:15:07.205183 
ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:07.205148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-ts778" Apr 16 22:15:07.337341 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:07.337316 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:15:07.339889 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:15:07.339857 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod399be673_c9fa_4c73_9706_3c82ffd26a3b.slice/crio-936a9f426255ff21b13367c36dd4555df18f9d5cf802d659e9b3d03902a22d43 WatchSource:0}: Error finding container 936a9f426255ff21b13367c36dd4555df18f9d5cf802d659e9b3d03902a22d43: Status 404 returned error can't find the container with id 936a9f426255ff21b13367c36dd4555df18f9d5cf802d659e9b3d03902a22d43 Apr 16 22:15:08.146471 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:08.146426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-ts778" event={"ID":"399be673-c9fa-4c73-9706-3c82ffd26a3b","Type":"ContainerStarted","Data":"936a9f426255ff21b13367c36dd4555df18f9d5cf802d659e9b3d03902a22d43"} Apr 16 22:15:10.155881 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:10.155793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-ts778" event={"ID":"399be673-c9fa-4c73-9706-3c82ffd26a3b","Type":"ContainerStarted","Data":"4412193c57fc13afc59ac1692a789c843d8c730a724f56d0a3d645896bcc4be8"} Apr 16 22:15:10.156275 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:10.155911 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-ts778" Apr 16 22:15:10.172841 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:10.172784 2576 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-ts778" podStartSLOduration=1.647814684 podStartE2EDuration="4.172767182s" podCreationTimestamp="2026-04-16 22:15:06 +0000 UTC" firstStartedPulling="2026-04-16 22:15:07.341720721 +0000 UTC m=+618.815099469" lastFinishedPulling="2026-04-16 22:15:09.866673216 +0000 UTC m=+621.340051967" observedRunningTime="2026-04-16 22:15:10.171032042 +0000 UTC m=+621.644410817" watchObservedRunningTime="2026-04-16 22:15:10.172767182 +0000 UTC m=+621.646145951" Apr 16 22:15:21.165424 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:21.165390 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-ts778" Apr 16 22:15:37.137785 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:15:37.137754 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:16:17.223134 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:17.223092 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:16:21.975170 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:21.975136 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:16:22.945840 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:22.945799 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q"] Apr 16 22:16:22.949411 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:22.949392 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:22.952148 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:22.952125 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 22:16:22.952263 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:22.952151 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-khb59\"" Apr 16 22:16:22.952263 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:22.952176 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 16 22:16:22.952263 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:22.952176 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 22:16:22.959280 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:22.959258 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q"] Apr 16 22:16:23.094808 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.094774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.095180 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.094820 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 
22:16:23.095180 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.094848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.095180 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.094872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkp5w\" (UniqueName: \"kubernetes.io/projected/4c8e49d5-d950-4634-930b-73d3d1cb899b-kube-api-access-nkp5w\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.095180 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.094926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.095180 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.094949 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c8e49d5-d950-4634-930b-73d3d1cb899b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.195548 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.195516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.195785 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.195569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.195785 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.195609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.195785 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.195635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkp5w\" (UniqueName: \"kubernetes.io/projected/4c8e49d5-d950-4634-930b-73d3d1cb899b-kube-api-access-nkp5w\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.195785 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.195728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " 
pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.195785 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.195758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c8e49d5-d950-4634-930b-73d3d1cb899b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.196094 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.195935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.196094 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.195996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.196094 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.196042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.198038 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.198017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/4c8e49d5-d950-4634-930b-73d3d1cb899b-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.198278 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.198260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c8e49d5-d950-4634-930b-73d3d1cb899b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.202574 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.202553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkp5w\" (UniqueName: \"kubernetes.io/projected/4c8e49d5-d950-4634-930b-73d3d1cb899b-kube-api-access-nkp5w\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q\" (UID: \"4c8e49d5-d950-4634-930b-73d3d1cb899b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.260757 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.260719 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:23.381632 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.381555 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q"] Apr 16 22:16:23.383993 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:16:23.383964 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8e49d5_d950_4634_930b_73d3d1cb899b.slice/crio-477c9f8d72fa19c00c60c331f8cc5662299a7a0061bcdd0d81090312e0fa3871 WatchSource:0}: Error finding container 477c9f8d72fa19c00c60c331f8cc5662299a7a0061bcdd0d81090312e0fa3871: Status 404 returned error can't find the container with id 477c9f8d72fa19c00c60c331f8cc5662299a7a0061bcdd0d81090312e0fa3871 Apr 16 22:16:23.403907 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:23.403877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" event={"ID":"4c8e49d5-d950-4634-930b-73d3d1cb899b","Type":"ContainerStarted","Data":"477c9f8d72fa19c00c60c331f8cc5662299a7a0061bcdd0d81090312e0fa3871"} Apr 16 22:16:25.477101 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:25.477060 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:16:29.435946 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:29.435910 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" event={"ID":"4c8e49d5-d950-4634-930b-73d3d1cb899b","Type":"ContainerStarted","Data":"1571e667cdebd9f3aa4d2a963f2aebd04bde2648c3b53dc08fbc47bb33164832"} Apr 16 22:16:34.455461 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:34.455426 2576 generic.go:358] "Generic (PLEG): container finished" podID="4c8e49d5-d950-4634-930b-73d3d1cb899b" containerID="1571e667cdebd9f3aa4d2a963f2aebd04bde2648c3b53dc08fbc47bb33164832" 
exitCode=0 Apr 16 22:16:34.455985 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:34.455507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" event={"ID":"4c8e49d5-d950-4634-930b-73d3d1cb899b","Type":"ContainerDied","Data":"1571e667cdebd9f3aa4d2a963f2aebd04bde2648c3b53dc08fbc47bb33164832"} Apr 16 22:16:34.456274 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:34.456255 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:16:38.476112 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:38.476080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" event={"ID":"4c8e49d5-d950-4634-930b-73d3d1cb899b","Type":"ContainerStarted","Data":"f0fcafa4e553a9e1231bc6a766dffb06559af09aba69aa37bdd3d44ea59761f2"} Apr 16 22:16:38.476525 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:38.476287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:38.493888 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:38.493844 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" podStartSLOduration=1.695074538 podStartE2EDuration="16.49383247s" podCreationTimestamp="2026-04-16 22:16:22 +0000 UTC" firstStartedPulling="2026-04-16 22:16:23.385631103 +0000 UTC m=+694.859009851" lastFinishedPulling="2026-04-16 22:16:38.184389032 +0000 UTC m=+709.657767783" observedRunningTime="2026-04-16 22:16:38.491885932 +0000 UTC m=+709.965264727" watchObservedRunningTime="2026-04-16 22:16:38.49383247 +0000 UTC m=+709.967211239" Apr 16 22:16:42.684414 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:42.684369 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:16:49.492881 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:16:49.492847 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q" Apr 16 22:16:56.369948 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.369911 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn"] Apr 16 22:16:56.431899 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.431864 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn"] Apr 16 22:16:56.432070 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.431988 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.434094 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.434067 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 16 22:16:56.475201 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.475172 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.475345 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.475207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.475345 ip-10-0-129-68 kubenswrapper[2576]: I0416 
22:16:56.475228 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8356cb2-2bb8-4d40-8dba-fc76ca504951-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.475345 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.475291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.475454 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.475372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.475454 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.475400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4kd5\" (UniqueName: \"kubernetes.io/projected/a8356cb2-2bb8-4d40-8dba-fc76ca504951-kube-api-access-d4kd5\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.576779 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.576748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.576949 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.576786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.576949 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.576808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8356cb2-2bb8-4d40-8dba-fc76ca504951-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.576949 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.576829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.576949 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.576872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " 
pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.576949 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.576899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4kd5\" (UniqueName: \"kubernetes.io/projected/a8356cb2-2bb8-4d40-8dba-fc76ca504951-kube-api-access-d4kd5\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.577210 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.577189 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.577319 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.577291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.577360 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.577313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.579197 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.579172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/a8356cb2-2bb8-4d40-8dba-fc76ca504951-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.579342 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.579326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8356cb2-2bb8-4d40-8dba-fc76ca504951-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.583956 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.583935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4kd5\" (UniqueName: \"kubernetes.io/projected/a8356cb2-2bb8-4d40-8dba-fc76ca504951-kube-api-access-d4kd5\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn\" (UID: \"a8356cb2-2bb8-4d40-8dba-fc76ca504951\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.742611 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.742571 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:16:56.864041 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:56.864007 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn"] Apr 16 22:16:56.867512 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:16:56.867476 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8356cb2_2bb8_4d40_8dba_fc76ca504951.slice/crio-29f890ec174e0a37c6043355c9ad848bcf534f9d6f273b1f88bb3392fdbd4d88 WatchSource:0}: Error finding container 29f890ec174e0a37c6043355c9ad848bcf534f9d6f273b1f88bb3392fdbd4d88: Status 404 returned error can't find the container with id 29f890ec174e0a37c6043355c9ad848bcf534f9d6f273b1f88bb3392fdbd4d88 Apr 16 22:16:57.540974 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:57.540933 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" event={"ID":"a8356cb2-2bb8-4d40-8dba-fc76ca504951","Type":"ContainerStarted","Data":"d2a1ff3b94bd2f85dfcb714e999b518ec28faeee72e708da7022f517da39b797"} Apr 16 22:16:57.540974 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:57.540979 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" event={"ID":"a8356cb2-2bb8-4d40-8dba-fc76ca504951","Type":"ContainerStarted","Data":"29f890ec174e0a37c6043355c9ad848bcf534f9d6f273b1f88bb3392fdbd4d88"} Apr 16 22:16:57.787155 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:16:57.787122 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:17:02.559618 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:17:02.559571 2576 generic.go:358] "Generic (PLEG): container finished" podID="a8356cb2-2bb8-4d40-8dba-fc76ca504951" 
containerID="d2a1ff3b94bd2f85dfcb714e999b518ec28faeee72e708da7022f517da39b797" exitCode=0 Apr 16 22:17:02.560115 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:17:02.559653 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" event={"ID":"a8356cb2-2bb8-4d40-8dba-fc76ca504951","Type":"ContainerDied","Data":"d2a1ff3b94bd2f85dfcb714e999b518ec28faeee72e708da7022f517da39b797"} Apr 16 22:17:03.564668 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:17:03.564630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" event={"ID":"a8356cb2-2bb8-4d40-8dba-fc76ca504951","Type":"ContainerStarted","Data":"4f72d1938370be9a708c96fe2888e8ac27825130e75edc734880d728b3750f47"} Apr 16 22:17:03.565056 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:17:03.564872 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:17:03.584856 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:17:03.584806 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" podStartSLOduration=7.3811856670000004 podStartE2EDuration="7.584790015s" podCreationTimestamp="2026-04-16 22:16:56 +0000 UTC" firstStartedPulling="2026-04-16 22:17:02.560399917 +0000 UTC m=+734.033778665" lastFinishedPulling="2026-04-16 22:17:02.764004259 +0000 UTC m=+734.237383013" observedRunningTime="2026-04-16 22:17:03.583527823 +0000 UTC m=+735.056906600" watchObservedRunningTime="2026-04-16 22:17:03.584790015 +0000 UTC m=+735.058168785" Apr 16 22:17:12.381179 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:17:12.381143 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:17:14.580747 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:17:14.580714 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn" Apr 16 22:17:55.375211 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:17:55.375174 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:17:59.780114 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:17:59.780077 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:18:06.282674 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:18:06.282592 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:18:16.684623 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:18:16.684584 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:18:25.373729 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:18:25.373673 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:18:35.986086 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:18:35.986050 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:18:44.080594 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:18:44.080560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:18:54.884460 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:18:54.884423 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:19:49.055816 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:19:49.055739 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:19:49.056302 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:19:49.055953 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:19:56.484545 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:19:56.484513 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:20:12.578285 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:20:12.578252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:20:50.384085 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:20:50.384053 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:21:07.487981 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:21:07.487899 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:21:21.478898 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:21:21.478859 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:21:37.489731 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:21:37.489674 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:22:30.878734 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:22:30.878686 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:22:40.581451 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:22:40.581366 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:22:57.375846 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:22:57.375808 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:23:05.578142 ip-10-0-129-68 kubenswrapper[2576]: I0416 
22:23:05.578082 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:23:21.876362 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:23:21.876323 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:23:31.184174 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:23:31.184136 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:24:03.683753 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:24:03.683666 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:24:12.477330 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:24:12.477296 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:24:20.882911 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:24:20.882872 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:24:29.181574 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:24:29.181536 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:24:37.479873 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:24:37.479829 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:24:49.082473 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:24:49.082446 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:24:49.082938 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:24:49.082856 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:24:54.175278 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:24:54.175241 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:25:04.989009 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:25:04.988975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:25:51.178523 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:25:51.178483 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:25:59.386861 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:25:59.386819 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:26:08.584293 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:26:08.584252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:26:16.987994 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:26:16.987957 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:26:26.281292 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:26:26.281254 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:26:34.384644 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:26:34.384613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:26:43.794209 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:26:43.794167 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:26:52.798896 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:26:52.798856 2576 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:27:01.303997 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:27:01.303960 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:27:09.702779 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:27:09.702688 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:27:18.819247 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:27:18.819212 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:27:27.404141 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:27:27.404104 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:27:36.433557 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:27:36.433518 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:27:44.590349 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:27:44.590307 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:27:53.711614 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:27:53.711578 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:28:02.487961 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:28:02.487924 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:28:11.987855 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:28:11.987819 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:28:19.882730 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:28:19.882674 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:29:49.106576 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:29:49.106548 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:29:49.109313 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:29:49.109290 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:30:00.140372 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:00.140337 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29606310-qh5s6"] Apr 16 22:30:00.143597 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:00.143579 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" Apr 16 22:30:00.145620 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:00.145603 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-mjfcm\"" Apr 16 22:30:00.150152 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:00.150122 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606310-qh5s6"] Apr 16 22:30:00.262218 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:00.262177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gllnd\" (UniqueName: \"kubernetes.io/projected/087927f6-e0dd-4229-adfe-5b86a92d7460-kube-api-access-gllnd\") pod \"maas-api-key-cleanup-29606310-qh5s6\" (UID: \"087927f6-e0dd-4229-adfe-5b86a92d7460\") " pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" Apr 16 22:30:00.362872 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:00.362837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gllnd\" 
(UniqueName: \"kubernetes.io/projected/087927f6-e0dd-4229-adfe-5b86a92d7460-kube-api-access-gllnd\") pod \"maas-api-key-cleanup-29606310-qh5s6\" (UID: \"087927f6-e0dd-4229-adfe-5b86a92d7460\") " pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" Apr 16 22:30:00.370653 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:00.370628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gllnd\" (UniqueName: \"kubernetes.io/projected/087927f6-e0dd-4229-adfe-5b86a92d7460-kube-api-access-gllnd\") pod \"maas-api-key-cleanup-29606310-qh5s6\" (UID: \"087927f6-e0dd-4229-adfe-5b86a92d7460\") " pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" Apr 16 22:30:00.454862 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:00.454831 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" Apr 16 22:30:00.577403 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:00.577370 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606310-qh5s6"] Apr 16 22:30:00.580837 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:30:00.580808 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod087927f6_e0dd_4229_adfe_5b86a92d7460.slice/crio-c9a0629db0f2027debd927e65f306ca340187717c4a1dc7d68a48573bd4ad00d WatchSource:0}: Error finding container c9a0629db0f2027debd927e65f306ca340187717c4a1dc7d68a48573bd4ad00d: Status 404 returned error can't find the container with id c9a0629db0f2027debd927e65f306ca340187717c4a1dc7d68a48573bd4ad00d Apr 16 22:30:00.582972 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:00.582952 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:30:01.277025 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:01.276988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" event={"ID":"087927f6-e0dd-4229-adfe-5b86a92d7460","Type":"ContainerStarted","Data":"c9a0629db0f2027debd927e65f306ca340187717c4a1dc7d68a48573bd4ad00d"} Apr 16 22:30:03.286274 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:03.286237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" event={"ID":"087927f6-e0dd-4229-adfe-5b86a92d7460","Type":"ContainerStarted","Data":"5490379f26c1a80018b586d48a4bac94e91e99ab905dd061e5c762b5e795c034"} Apr 16 22:30:03.301360 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:03.301304 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" podStartSLOduration=1.30616529 podStartE2EDuration="3.301291136s" podCreationTimestamp="2026-04-16 22:30:00 +0000 UTC" firstStartedPulling="2026-04-16 22:30:00.583086732 +0000 UTC m=+1512.056465482" lastFinishedPulling="2026-04-16 22:30:02.57821258 +0000 UTC m=+1514.051591328" observedRunningTime="2026-04-16 22:30:03.299528258 +0000 UTC m=+1514.772907039" watchObservedRunningTime="2026-04-16 22:30:03.301291136 +0000 UTC m=+1514.774669906" Apr 16 22:30:23.363265 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:23.363225 2576 generic.go:358] "Generic (PLEG): container finished" podID="087927f6-e0dd-4229-adfe-5b86a92d7460" containerID="5490379f26c1a80018b586d48a4bac94e91e99ab905dd061e5c762b5e795c034" exitCode=6 Apr 16 22:30:23.363721 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:23.363298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" event={"ID":"087927f6-e0dd-4229-adfe-5b86a92d7460","Type":"ContainerDied","Data":"5490379f26c1a80018b586d48a4bac94e91e99ab905dd061e5c762b5e795c034"} Apr 16 22:30:23.363808 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:23.363719 2576 scope.go:117] "RemoveContainer" 
containerID="5490379f26c1a80018b586d48a4bac94e91e99ab905dd061e5c762b5e795c034" Apr 16 22:30:24.368472 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:24.368439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" event={"ID":"087927f6-e0dd-4229-adfe-5b86a92d7460","Type":"ContainerStarted","Data":"8ef2af85cdd3cf5d77b6793d7ff2a025aeae6306dbf08ac1273771cfb055d064"} Apr 16 22:30:35.775903 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:35.775864 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:30:43.187435 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:43.187398 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:30:44.437885 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:44.437852 2576 generic.go:358] "Generic (PLEG): container finished" podID="087927f6-e0dd-4229-adfe-5b86a92d7460" containerID="8ef2af85cdd3cf5d77b6793d7ff2a025aeae6306dbf08ac1273771cfb055d064" exitCode=6 Apr 16 22:30:44.438354 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:44.437914 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" event={"ID":"087927f6-e0dd-4229-adfe-5b86a92d7460","Type":"ContainerDied","Data":"8ef2af85cdd3cf5d77b6793d7ff2a025aeae6306dbf08ac1273771cfb055d064"} Apr 16 22:30:44.438354 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:44.437948 2576 scope.go:117] "RemoveContainer" containerID="5490379f26c1a80018b586d48a4bac94e91e99ab905dd061e5c762b5e795c034" Apr 16 22:30:44.438354 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:30:44.438315 2576 scope.go:117] "RemoveContainer" containerID="8ef2af85cdd3cf5d77b6793d7ff2a025aeae6306dbf08ac1273771cfb055d064" Apr 16 22:30:44.438568 ip-10-0-129-68 kubenswrapper[2576]: E0416 22:30:44.438546 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29606310-qh5s6_opendatahub(087927f6-e0dd-4229-adfe-5b86a92d7460)\"" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" podUID="087927f6-e0dd-4229-adfe-5b86a92d7460" Apr 16 22:31:00.010553 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:00.010516 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606310-qh5s6"] Apr 16 22:31:00.140058 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:00.140029 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" Apr 16 22:31:00.189005 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:00.188973 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gllnd\" (UniqueName: \"kubernetes.io/projected/087927f6-e0dd-4229-adfe-5b86a92d7460-kube-api-access-gllnd\") pod \"087927f6-e0dd-4229-adfe-5b86a92d7460\" (UID: \"087927f6-e0dd-4229-adfe-5b86a92d7460\") " Apr 16 22:31:00.191242 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:00.191213 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087927f6-e0dd-4229-adfe-5b86a92d7460-kube-api-access-gllnd" (OuterVolumeSpecName: "kube-api-access-gllnd") pod "087927f6-e0dd-4229-adfe-5b86a92d7460" (UID: "087927f6-e0dd-4229-adfe-5b86a92d7460"). InnerVolumeSpecName "kube-api-access-gllnd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:31:00.290159 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:00.290082 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gllnd\" (UniqueName: \"kubernetes.io/projected/087927f6-e0dd-4229-adfe-5b86a92d7460-kube-api-access-gllnd\") on node \"ip-10-0-129-68.ec2.internal\" DevicePath \"\"" Apr 16 22:31:00.505585 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:00.505550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" event={"ID":"087927f6-e0dd-4229-adfe-5b86a92d7460","Type":"ContainerDied","Data":"c9a0629db0f2027debd927e65f306ca340187717c4a1dc7d68a48573bd4ad00d"} Apr 16 22:31:00.505585 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:00.505566 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606310-qh5s6" Apr 16 22:31:00.505585 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:00.505594 2576 scope.go:117] "RemoveContainer" containerID="8ef2af85cdd3cf5d77b6793d7ff2a025aeae6306dbf08ac1273771cfb055d064" Apr 16 22:31:00.525087 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:00.525059 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606310-qh5s6"] Apr 16 22:31:00.527138 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:00.527114 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606310-qh5s6"] Apr 16 22:31:01.138034 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:01.137996 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="087927f6-e0dd-4229-adfe-5b86a92d7460" path="/var/lib/kubelet/pods/087927f6-e0dd-4229-adfe-5b86a92d7460/volumes" Apr 16 22:31:08.189999 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:08.189969 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 
22:31:12.584709 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:12.584654 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:31:22.985861 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:22.985828 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:31:32.485889 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:32.485846 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:31:41.192978 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:41.192941 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:31:52.179395 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:31:52.179356 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:32:00.789713 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:32:00.789656 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:32:11.579837 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:32:11.579800 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:32:20.484290 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:32:20.484247 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:32:31.381238 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:32:31.381198 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:32:40.786612 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:32:40.786575 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:33:15.380634 ip-10-0-129-68 
kubenswrapper[2576]: I0416 22:33:15.380563 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:33:58.478193 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:33:58.478148 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:34:06.587887 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:34:06.587845 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:34:14.483067 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:34:14.483028 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:34:23.692242 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:34:23.692201 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:34:32.887264 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:34:32.887178 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:34:45.778468 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:34:45.778426 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:34:49.131669 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:34:49.131643 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:34:49.135197 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:34:49.135175 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:34:53.192471 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:34:53.192435 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:35:00.982646 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:35:00.982598 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:35:10.581210 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:35:10.581173 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:35:18.688365 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:35:18.688329 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:35:26.487640 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:35:26.487604 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:35:37.083359 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:35:37.083322 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:35:54.880915 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:35:54.880879 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:36:03.593234 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:36:03.593154 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:36:12.093788 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:36:12.093747 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:36:21.086316 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:36:21.086282 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:36:37.987484 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:36:37.987449 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:36:45.686808 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:36:45.686773 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:36:54.780470 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:36:54.780429 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:37:02.981493 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:37:02.981459 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:37:11.686800 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:37:11.686762 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:37:20.594027 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:37:20.593991 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:37:29.776967 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:37:29.776929 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:37:40.778464 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:37:40.778379 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:37:49.791379 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:37:49.791342 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:38:00.912415 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:38:00.912379 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:38:09.597867 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:38:09.597831 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:38:18.602875 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:38:18.602836 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:38:26.583352 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:38:26.583312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:38:34.485372 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:38:34.485335 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:38:50.680354 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:38:50.680310 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:38:58.985376 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:38:58.985344 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:39:08.892414 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:39:08.892323 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:39:16.783184 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:39:16.783138 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:39:41.383722 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:39:41.383664 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:39:49.156573 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:39:49.156539 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:39:49.160288 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:39:49.160265 2576 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:39:52.386114 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:39:52.386078 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ts778"] Apr 16 22:39:58.465011 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:39:58.464976 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-l8fml_1d3b2559-e6ac-4fd3-ab02-a0e626613f92/manager/0.log" Apr 16 22:39:58.849277 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:39:58.849194 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-4zzf4_1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a/manager/2.log" Apr 16 22:39:58.960492 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:39:58.960455 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-674f8cc5cf-h657d_42fa1204-d28a-413e-9cb1-ad8db42994af/manager/0.log" Apr 16 22:40:01.228878 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:01.228848 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-ts778_399be673-c9fa-4c73-9706-3c82ffd26a3b/limitador/0.log" Apr 16 22:40:01.345920 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:01.345885 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-p2465_49a88115-b1b6-4e3a-b806-2810565bea72/manager/0.log" Apr 16 22:40:01.799292 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:01.799261 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-f6qtk_545b828a-0345-4b9d-a2d0-2f95cbc996d7/discovery/0.log" Apr 16 22:40:02.024249 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:02.024219 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-6c9f6bcb5c-x6fqd_6ed6f5be-2f34-44d2-b714-759092f582f3/kube-auth-proxy/0.log" Apr 16 22:40:02.598977 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:02.598945 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q_4c8e49d5-d950-4634-930b-73d3d1cb899b/storage-initializer/0.log" Apr 16 22:40:02.606600 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:02.606565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-nzl5q_4c8e49d5-d950-4634-930b-73d3d1cb899b/main/0.log" Apr 16 22:40:03.061936 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:03.061905 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn_a8356cb2-2bb8-4d40-8dba-fc76ca504951/storage-initializer/0.log" Apr 16 22:40:03.069098 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:03.069063 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-lslwn_a8356cb2-2bb8-4d40-8dba-fc76ca504951/main/0.log" Apr 16 22:40:09.783127 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:09.783087 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-c2gt5_d899808a-e158-4915-b6c2-f135d5b829ef/global-pull-secret-syncer/0.log" Apr 16 22:40:09.935500 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:09.935465 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-bdwfp_95d30843-3d5e-42ad-94ae-c9a2c65d3e0a/konnectivity-agent/0.log" Apr 16 22:40:09.954309 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:09.954285 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-68.ec2.internal_67efa12b1e612144155ca84bcb8df9e9/haproxy/0.log" Apr 16 22:40:14.572795 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:14.572760 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-ts778_399be673-c9fa-4c73-9706-3c82ffd26a3b/limitador/0.log" Apr 16 22:40:14.668247 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:14.668215 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-p2465_49a88115-b1b6-4e3a-b806-2810565bea72/manager/0.log" Apr 16 22:40:16.227052 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:16.227020 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-69m6w_d91095be-a477-4c1e-bd66-36fd94142428/node-exporter/0.log" Apr 16 22:40:16.264381 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:16.264309 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-69m6w_d91095be-a477-4c1e-bd66-36fd94142428/kube-rbac-proxy/0.log" Apr 16 22:40:16.297294 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:16.297270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-69m6w_d91095be-a477-4c1e-bd66-36fd94142428/init-textfile/0.log" Apr 16 22:40:18.110776 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.110746 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-s42mm_0f18a248-9dfa-4c91-b4cd-46c1f19634f1/networking-console-plugin/0.log" Apr 16 22:40:18.407327 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.407249 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7"] Apr 16 22:40:18.407563 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.407551 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="087927f6-e0dd-4229-adfe-5b86a92d7460" containerName="cleanup" Apr 16 22:40:18.407620 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.407565 2576 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="087927f6-e0dd-4229-adfe-5b86a92d7460" containerName="cleanup" Apr 16 22:40:18.407620 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.407584 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="087927f6-e0dd-4229-adfe-5b86a92d7460" containerName="cleanup" Apr 16 22:40:18.407620 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.407597 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="087927f6-e0dd-4229-adfe-5b86a92d7460" containerName="cleanup" Apr 16 22:40:18.407735 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.407654 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="087927f6-e0dd-4229-adfe-5b86a92d7460" containerName="cleanup" Apr 16 22:40:18.410769 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.410749 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.413237 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.413216 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wq7jn\"/\"openshift-service-ca.crt\"" Apr 16 22:40:18.413922 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.413908 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wq7jn\"/\"kube-root-ca.crt\"" Apr 16 22:40:18.413981 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.413919 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wq7jn\"/\"default-dockercfg-9fxhp\"" Apr 16 22:40:18.418802 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.418776 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7"] Apr 16 22:40:18.546514 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.546464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-mm8zz\" (UniqueName: \"kubernetes.io/projected/f1f94d6e-d8b5-4eae-bfa3-815529c03292-kube-api-access-mm8zz\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.546688 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.546564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-proc\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.546688 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.546606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-lib-modules\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.546688 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.546630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-sys\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.546820 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.546718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-podres\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " 
pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.647512 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.647479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm8zz\" (UniqueName: \"kubernetes.io/projected/f1f94d6e-d8b5-4eae-bfa3-815529c03292-kube-api-access-mm8zz\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.647717 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.647534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-proc\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.647717 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.647564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-lib-modules\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.647717 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.647586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-sys\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.647717 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.647661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-sys\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.647717 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.647670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-podres\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.647925 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.647748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-lib-modules\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.647925 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.647671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-proc\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.647925 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.647753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f1f94d6e-d8b5-4eae-bfa3-815529c03292-podres\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.656285 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.656255 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mm8zz\" (UniqueName: \"kubernetes.io/projected/f1f94d6e-d8b5-4eae-bfa3-815529c03292-kube-api-access-mm8zz\") pod \"perf-node-gather-daemonset-4jbn7\" (UID: \"f1f94d6e-d8b5-4eae-bfa3-815529c03292\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.720968 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.720939 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:18.846237 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.846191 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7"] Apr 16 22:40:18.848652 ip-10-0-129-68 kubenswrapper[2576]: W0416 22:40:18.848624 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf1f94d6e_d8b5_4eae_bfa3_815529c03292.slice/crio-c613b2b02709dbfb3300a8ae7797820d8c13bc436584a7ac3bf45e0aa32224b5 WatchSource:0}: Error finding container c613b2b02709dbfb3300a8ae7797820d8c13bc436584a7ac3bf45e0aa32224b5: Status 404 returned error can't find the container with id c613b2b02709dbfb3300a8ae7797820d8c13bc436584a7ac3bf45e0aa32224b5 Apr 16 22:40:18.850220 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:18.850204 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:40:19.465970 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:19.465934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" event={"ID":"f1f94d6e-d8b5-4eae-bfa3-815529c03292","Type":"ContainerStarted","Data":"5f9f57fe0a93a960de2cc945b41d8b0f04b98d6fc38cdc739583ddcad04f331a"} Apr 16 22:40:19.465970 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:19.465971 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" event={"ID":"f1f94d6e-d8b5-4eae-bfa3-815529c03292","Type":"ContainerStarted","Data":"c613b2b02709dbfb3300a8ae7797820d8c13bc436584a7ac3bf45e0aa32224b5"} Apr 16 22:40:19.466373 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:19.466008 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:19.481124 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:19.481078 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" podStartSLOduration=1.481064501 podStartE2EDuration="1.481064501s" podCreationTimestamp="2026-04-16 22:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:40:19.479965226 +0000 UTC m=+2130.953343996" watchObservedRunningTime="2026-04-16 22:40:19.481064501 +0000 UTC m=+2130.954443264" Apr 16 22:40:20.428554 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:20.428521 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-r2hfz_270e029b-17e5-4312-8fcc-59dfc7eecac7/dns/0.log" Apr 16 22:40:20.446887 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:20.446861 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-r2hfz_270e029b-17e5-4312-8fcc-59dfc7eecac7/kube-rbac-proxy/0.log" Apr 16 22:40:20.552093 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:20.552063 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4llp9_e04f9b26-0017-48cc-a5f0-a9c2bae5d9df/dns-node-resolver/0.log" Apr 16 22:40:21.120751 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:21.120716 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tfhck_03c81f44-bba2-4d54-b6db-157f9d7e76c7/node-ca/0.log" Apr 
16 22:40:22.086295 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:22.086258 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-f6qtk_545b828a-0345-4b9d-a2d0-2f95cbc996d7/discovery/0.log" Apr 16 22:40:22.126828 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:22.126802 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6c9f6bcb5c-x6fqd_6ed6f5be-2f34-44d2-b714-759092f582f3/kube-auth-proxy/0.log" Apr 16 22:40:22.720734 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:22.720683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2pb4t_97234e5c-490f-432d-a702-1a85fbcc4044/serve-healthcheck-canary/0.log" Apr 16 22:40:23.233998 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:23.233970 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4x6xj_e6687b77-55a8-40de-b6f7-e53478b1e21b/kube-rbac-proxy/0.log" Apr 16 22:40:23.255218 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:23.255184 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4x6xj_e6687b77-55a8-40de-b6f7-e53478b1e21b/exporter/0.log" Apr 16 22:40:23.275550 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:23.275504 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4x6xj_e6687b77-55a8-40de-b6f7-e53478b1e21b/extractor/0.log" Apr 16 22:40:25.234057 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:25.234013 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-l8fml_1d3b2559-e6ac-4fd3-ab02-a0e626613f92/manager/0.log" Apr 16 22:40:25.378609 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:25.378582 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-4zzf4_1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a/manager/1.log" Apr 16 22:40:25.399755 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:25.399718 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-4zzf4_1f4e3ca8-23e2-4f21-95c4-8e25b4582c9a/manager/2.log" Apr 16 22:40:25.423345 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:25.423316 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-674f8cc5cf-h657d_42fa1204-d28a-413e-9cb1-ad8db42994af/manager/0.log" Apr 16 22:40:25.479091 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:25.479067 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-4jbn7" Apr 16 22:40:26.663818 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:26.663789 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-q4fq7_14d40e20-7602-4045-95a7-0b73bf25f04e/openshift-lws-operator/0.log" Apr 16 22:40:32.496558 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:32.496482 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2gb76_995fbb71-6c0e-4689-8c49-6fd0c1a79f15/kube-multus-additional-cni-plugins/0.log" Apr 16 22:40:32.520415 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:32.520388 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2gb76_995fbb71-6c0e-4689-8c49-6fd0c1a79f15/egress-router-binary-copy/0.log" Apr 16 22:40:32.539793 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:32.539767 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2gb76_995fbb71-6c0e-4689-8c49-6fd0c1a79f15/cni-plugins/0.log" Apr 16 22:40:32.558485 ip-10-0-129-68 kubenswrapper[2576]: I0416 
22:40:32.558464 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2gb76_995fbb71-6c0e-4689-8c49-6fd0c1a79f15/bond-cni-plugin/0.log" Apr 16 22:40:32.576924 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:32.576898 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2gb76_995fbb71-6c0e-4689-8c49-6fd0c1a79f15/routeoverride-cni/0.log" Apr 16 22:40:32.596912 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:32.596892 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2gb76_995fbb71-6c0e-4689-8c49-6fd0c1a79f15/whereabouts-cni-bincopy/0.log" Apr 16 22:40:32.624171 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:32.624148 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2gb76_995fbb71-6c0e-4689-8c49-6fd0c1a79f15/whereabouts-cni/0.log" Apr 16 22:40:33.040840 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:33.040806 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vlrfv_482f4fcf-1af7-4c0a-a8d2-c059af41fba7/kube-multus/0.log" Apr 16 22:40:33.058798 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:33.058761 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hzjxc_6690fd79-9fd1-41a1-acf7-d29fd96d4757/network-metrics-daemon/0.log" Apr 16 22:40:33.075353 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:33.075321 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hzjxc_6690fd79-9fd1-41a1-acf7-d29fd96d4757/kube-rbac-proxy/0.log" Apr 16 22:40:34.012989 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:34.012962 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-controller/0.log" Apr 16 22:40:34.028534 
ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:34.028510 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/0.log" Apr 16 22:40:34.047307 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:34.047280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovn-acl-logging/1.log" Apr 16 22:40:34.070924 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:34.070904 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/kube-rbac-proxy-node/0.log" Apr 16 22:40:34.091963 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:34.091941 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 22:40:34.107297 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:34.107275 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/northd/0.log" Apr 16 22:40:34.125359 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:34.125329 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/nbdb/0.log" Apr 16 22:40:34.144125 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:34.144107 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/sbdb/0.log" Apr 16 22:40:34.306797 ip-10-0-129-68 kubenswrapper[2576]: I0416 22:40:34.306712 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25llf_d7498930-9a40-4a06-a45f-79c56cdfd2e3/ovnkube-controller/0.log"