Apr 20 07:47:34.220319 ip-10-0-138-4 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 07:47:34.220330 ip-10-0-138-4 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 07:47:34.220338 ip-10-0-138-4 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 07:47:34.220549 ip-10-0-138-4 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 07:47:44.253017 ip-10-0-138-4 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 07:47:44.253032 ip-10-0-138-4 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f5d968e4eeb54cc2a9287d738748c455 --
Apr 20 07:50:06.582462 ip-10-0-138-4 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 07:50:07.049558 ip-10-0-138-4 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:50:07.049558 ip-10-0-138-4 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 07:50:07.049558 ip-10-0-138-4 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:50:07.049558 ip-10-0-138-4 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 07:50:07.049558 ip-10-0-138-4 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:50:07.051103 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.051013 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 07:50:07.054102 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054088 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:50:07.054102 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054102 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054105 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054109 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054112 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054116 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054119 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054122 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054126 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054129 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054132 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054140 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054143 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054146 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054149 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054152 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054154 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054157 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054159 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054162 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054164 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:50:07.054167 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054167 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054170 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054172 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054175 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054179 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054182 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054185 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054187 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054190 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054192 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054195 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054198 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054200 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054203 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054217 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054228 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054232 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054236 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054238 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054241 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:50:07.054668 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054243 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054246 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054249 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054252 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054254 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054257 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054262 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054266 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054269 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054272 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054274 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054277 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054280 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054282 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054286 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054290 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054294 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054297 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054300 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:50:07.055176 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054302 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054305 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054308 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054310 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054313 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054315 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054318 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054320 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054323 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054326 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054329 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054331 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054334 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054336 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054339 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054341 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054343 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054346 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054349 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:50:07.055699 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054351 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054354 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054358 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054361 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054363 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054366 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054368 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054728 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054735 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054738 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054741 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054743 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054746 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054748 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054751 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054754 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054756 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054759 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054761 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054764 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:50:07.056191 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054767 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054769 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054772 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054775 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054777 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054780 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054782 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054785 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054787 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054790 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054792 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054795 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054798 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054801 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054803 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054806 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054809 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054811 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054814 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054816 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:50:07.056693 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054820 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054823 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054826 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054828 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054831 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054833 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054836 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054838 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054841 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054843 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054846 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054849 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054852 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054854 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054856 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054860 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054862 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054865 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054867 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054870 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:50:07.057380 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054872 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054875 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054877 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054881 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054885 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054888 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054891 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054894 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054896 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054899 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054901 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054904 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054909 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054912 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054916 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054919 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054922 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054924 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054927 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:50:07.058109 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054930 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054932 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054936 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054938 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054941 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054944 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054947 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054950 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054953 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054956 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054959 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054962 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054964 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.054967 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056311 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056321 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056327 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056332 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056336 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056339 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056343 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 07:50:07.058621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056348 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056351 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056354 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056358 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056362 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056365 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056368 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056371 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056374 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056377 2569 flags.go:64] FLAG: --cloud-config=""
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056380 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056382 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056389 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056392 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056395 2569 flags.go:64] FLAG: --config-dir=""
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056398 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056401 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056405 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056408 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056411 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056415 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056418 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056421 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056424 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056427 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 07:50:07.059138 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056430 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056434 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056437 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056440 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056443 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056447 2569 flags.go:64] FLAG: --enable-server="true"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056450 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056454 2569 flags.go:64] FLAG: --event-burst="100"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056457 2569 flags.go:64] FLAG: --event-qps="50"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056460 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056464 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056467 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056471 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056474 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056477 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056480 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056483 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056486 2569 flags.go:64] FLAG:
--exit-on-lock-contention="false" Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056489 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056492 2569 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056495 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056497 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056500 2569 flags.go:64] FLAG: --feature-gates="" Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056504 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056508 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 07:50:07.059760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056511 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056514 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056517 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056520 2569 flags.go:64] FLAG: --help="false" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056524 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056527 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056530 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 
07:50:07.056533 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056536 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056540 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056543 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056547 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056550 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056552 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056558 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056561 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056564 2569 flags.go:64] FLAG: --kube-reserved="" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056567 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056570 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056573 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056576 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 07:50:07.060384 
ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056579 2569 flags.go:64] FLAG: --lock-file="" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056582 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056585 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 07:50:07.060384 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056588 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056594 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056596 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056599 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056602 2569 flags.go:64] FLAG: --logging-format="text" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056605 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056608 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056611 2569 flags.go:64] FLAG: --manifest-url="" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056614 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056619 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056622 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056626 2569 flags.go:64] FLAG: --max-pods="110" Apr 20 07:50:07.060980 
ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056629 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056632 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056635 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056638 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056641 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056644 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056647 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056654 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056657 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056660 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056665 2569 flags.go:64] FLAG: --pod-cidr="" Apr 20 07:50:07.060980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056668 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056673 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056676 2569 flags.go:64] FLAG: 
--pod-max-pids="-1" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056679 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056682 2569 flags.go:64] FLAG: --port="10250" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056685 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056688 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08bcb56a8740c36b2" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056691 2569 flags.go:64] FLAG: --qos-reserved="" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056695 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056698 2569 flags.go:64] FLAG: --register-node="true" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056701 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056704 2569 flags.go:64] FLAG: --register-with-taints="" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056708 2569 flags.go:64] FLAG: --registry-burst="10" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056711 2569 flags.go:64] FLAG: --registry-qps="5" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056714 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056716 2569 flags.go:64] FLAG: --reserved-memory="" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056721 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056723 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 07:50:07.061551 
ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056726 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056729 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056732 2569 flags.go:64] FLAG: --runonce="false" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056735 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056738 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056741 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056744 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056747 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 07:50:07.061551 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056750 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056753 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056756 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056759 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056762 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056765 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056768 2569 flags.go:64] FLAG: 
--streaming-connection-idle-timeout="4h0m0s" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056771 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056774 2569 flags.go:64] FLAG: --system-cgroups="" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056777 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056782 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056785 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056788 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056792 2569 flags.go:64] FLAG: --tls-min-version="" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056795 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056798 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056801 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056803 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056806 2569 flags.go:64] FLAG: --v="2" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056810 2569 flags.go:64] FLAG: --version="false" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056814 2569 flags.go:64] FLAG: --vmodule="" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056823 2569 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.056826 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057377 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057390 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:50:07.062183 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057395 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057400 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057404 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057408 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057412 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057418 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057423 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057428 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057438 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 
07:50:07.057445 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057450 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057454 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057458 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057463 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057467 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057471 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057475 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057479 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057484 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057488 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:50:07.062796 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057497 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057501 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:50:07.063392 ip-10-0-138-4 
kubenswrapper[2569]: W0420 07:50:07.057505 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057509 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057514 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057518 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057522 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057526 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057530 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057534 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057539 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057543 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057547 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057556 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057560 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:50:07.063392 ip-10-0-138-4 
kubenswrapper[2569]: W0420 07:50:07.057564 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057568 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057573 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057578 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057583 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:50:07.063392 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057587 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057593 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057598 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057603 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057607 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057611 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057620 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057625 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup 
Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057629 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057633 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057637 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057641 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057648 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057654 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057660 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057664 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057667 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057670 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057678 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:50:07.063879 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057680 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:50:07.064395 
ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057684 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057686 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057689 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057692 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057694 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057697 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057700 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057706 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057713 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057718 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057726 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057740 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057745 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057751 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057755 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057760 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057764 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057768 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:50:07.064395 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057772 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:50:07.064871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057776 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:50:07.064871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057780 2569 feature_gate.go:328] unrecognized feature 
gate: AlibabaPlatform
Apr 20 07:50:07.064871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057784 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:50:07.064871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057788 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:50:07.064871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.057797 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:50:07.064871 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.057806 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 07:50:07.065168 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.065055 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 07:50:07.065201 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.065170 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 07:50:07.065246 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065231 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:50:07.065246 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065237 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:50:07.065246 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065241 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:50:07.065246 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065244 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:50:07.065246 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065247 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065250 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065253 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065256 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065259 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065262 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065265 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065268 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065271 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065273 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065276 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065279 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065283 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065285 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065288 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065290 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065293 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065295 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065298 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065300 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:50:07.065374 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065303 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065305 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065308 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065310 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065313 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065317 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065322 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065325 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065328 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065330 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065333 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065336 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065338 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065341 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065345 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065347 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065350 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065353 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065355 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:50:07.065871 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065358 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065360 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065363 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065366 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065369 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065373 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065376 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065379 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065382 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065385 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065388 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065391 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065393 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065396 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065398 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065401 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065404 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065406 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065409 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:50:07.066357 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065412 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065415 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065417 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065420 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065422 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065425 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065428 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065430 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065434 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065437 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065439 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065442 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065445 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065447 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065450 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065452 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065455 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065458 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065460 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065463 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:50:07.066858 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065465 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065468 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065470 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065473 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.065479 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065571 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065577 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065580 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065584 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065587 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065589 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065592 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065595 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065599 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065603 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:50:07.067354 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065606 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065610 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065614 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065617 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065619 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065622 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065625 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065628 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065630 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065633 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065635 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065638 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065641 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065643 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065646 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065648 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065651 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065653 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065655 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065658 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:50:07.067732 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065660 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065663 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065665 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065668 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065671 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065673 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065675 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065678 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065680 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065683 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065686 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065688 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065691 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065693 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065696 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065698 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065700 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065703 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065706 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:50:07.068297 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065708 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065711 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065713 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065716 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065718 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065721 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065723 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065726 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065728 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065731 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065734 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065736 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065739 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065741 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065743 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065746 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065748 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065752 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065754 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:50:07.068760 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065756 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065759 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065761 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065764 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065766 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065769 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065772 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065774 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065776 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065779 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065781 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065784 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065787 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065789 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065792 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065794 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065797 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:50:07.069220 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:07.065799 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:50:07.069692 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.065804 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 07:50:07.069692 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.066524 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 07:50:07.069692 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.069253 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 07:50:07.070087 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.070075 2569 server.go:1019] "Starting client certificate rotation"
Apr 20 07:50:07.070185 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.070169 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 07:50:07.070239 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.070232 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 07:50:07.099786 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.099769 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 07:50:07.103267 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.103248 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 07:50:07.121978 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.121955 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 20 07:50:07.128334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.128319 2569 log.go:25] "Validated CRI v1 image API"
Apr 20 07:50:07.130312 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.130292 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 07:50:07.132429 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.132407 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 07:50:07.132640 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.132621 2569 fs.go:135] Filesystem UUIDs: map[3665b458-c274-459f-9085-cda218fa7ff6:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 bc028ac8-c3c9-4df0-bfe3-b671c3935468:/dev/nvme0n1p4]
Apr 20 07:50:07.132677 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.132641 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 07:50:07.139411 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.139300 2569 manager.go:217] Machine: {Timestamp:2026-04-20 07:50:07.136294718 +0000 UTC m=+0.432867714 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102117 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28d258b55aece792f5b535cf771066 SystemUUID:ec28d258-b55a-ece7-92f5-b535cf771066 BootID:f5d968e4-eeb5-4cc2-a928-7d738748c455 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a5:1f:34:06:75 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a5:1f:34:06:75 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:9e:64:80:1c:94 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 07:50:07.139411 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.139406 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 07:50:07.139539 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.139527 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 07:50:07.142076 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.142048 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 07:50:07.142230 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.142078 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-4.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 07:50:07.142274 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.142242 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 07:50:07.142274 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.142251 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 07:50:07.142274 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.142264 2569 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 07:50:07.143235 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.143225 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 07:50:07.144510 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.144499 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 20 07:50:07.144614 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.144605 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 07:50:07.146373 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.146357 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-288zf" Apr 20 07:50:07.146622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.146613 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 20 07:50:07.146657 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.146626 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 07:50:07.146657 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.146638 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 07:50:07.146657 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.146647 2569 kubelet.go:397] "Adding apiserver pod source" Apr 20 07:50:07.146657 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.146656 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 07:50:07.147716 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.147705 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 07:50:07.147761 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.147723 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 07:50:07.151775 ip-10-0-138-4 kubenswrapper[2569]: I0420 
07:50:07.151760 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 07:50:07.153238 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.153223 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 07:50:07.153303 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.153290 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-288zf" Apr 20 07:50:07.155202 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155182 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 07:50:07.155279 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155221 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 07:50:07.155279 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155229 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 07:50:07.155279 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155235 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 07:50:07.155279 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155241 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 07:50:07.155279 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155247 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 07:50:07.155279 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155253 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 07:50:07.155279 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155258 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 07:50:07.155279 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155265 2569 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 07:50:07.155279 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155271 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 07:50:07.155503 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155286 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 07:50:07.155503 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.155296 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 07:50:07.156084 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.156074 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 07:50:07.156132 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.156084 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 07:50:07.159867 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.159853 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 07:50:07.159971 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.159894 2569 server.go:1295] "Started kubelet" Apr 20 07:50:07.160018 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.159979 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 07:50:07.160141 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.160089 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 07:50:07.160197 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.160172 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 07:50:07.160785 ip-10-0-138-4 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 07:50:07.164157 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.164080 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:07.164307 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.164287 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 07:50:07.165697 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.165659 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 20 07:50:07.167325 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.167302 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:07.168509 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.168410 2569 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-4.ec2.internal" not found Apr 20 07:50:07.169514 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:07.169492 2569 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 07:50:07.170427 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.170409 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 07:50:07.170727 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.170710 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 07:50:07.171440 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171413 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 07:50:07.171520 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:07.171453 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-4.ec2.internal\" not found" Apr 20 07:50:07.171620 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171597 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 07:50:07.171620 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171613 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 07:50:07.171620 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171619 2569 factory.go:55] Registering systemd factory Apr 20 07:50:07.171620 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171625 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 07:50:07.171877 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171629 2569 factory.go:223] Registration of the systemd container factory successfully Apr 20 07:50:07.171877 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171730 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 20 07:50:07.171877 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171738 2569 reconciler.go:26] "Reconciler: start to 
sync state" Apr 20 07:50:07.171877 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171846 2569 factory.go:153] Registering CRI-O factory Apr 20 07:50:07.171877 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171856 2569 factory.go:223] Registration of the crio container factory successfully Apr 20 07:50:07.171877 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171877 2569 factory.go:103] Registering Raw factory Apr 20 07:50:07.172073 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.171891 2569 manager.go:1196] Started watching for new ooms in manager Apr 20 07:50:07.172366 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.172353 2569 manager.go:319] Starting recovery of all containers Apr 20 07:50:07.173677 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.173654 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:07.176752 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:07.176724 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-4.ec2.internal\" not found" node="ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.181121 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.181106 2569 manager.go:324] Recovery completed Apr 20 07:50:07.183882 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.183865 2569 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-4.ec2.internal" not found Apr 20 07:50:07.185493 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.185477 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:07.187311 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.187297 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-4.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:07.187383 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.187331 2569 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-138-4.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:07.187383 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.187342 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-4.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:07.187786 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.187773 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 07:50:07.187786 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.187785 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 07:50:07.187902 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.187805 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 20 07:50:07.190736 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.190722 2569 policy_none.go:49] "None policy: Start" Apr 20 07:50:07.190801 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.190741 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 07:50:07.190801 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.190754 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 20 07:50:07.227746 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.227733 2569 manager.go:341] "Starting Device Plugin manager" Apr 20 07:50:07.227827 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:07.227762 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 07:50:07.227827 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.227771 2569 server.go:85] "Starting device plugin registration server" Apr 20 07:50:07.228026 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.228013 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 07:50:07.228074 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.228030 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 07:50:07.228198 ip-10-0-138-4 
kubenswrapper[2569]: I0420 07:50:07.228177 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 07:50:07.228327 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.228272 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 07:50:07.228327 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.228283 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 07:50:07.229062 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:07.229043 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 07:50:07.229129 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:07.229084 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-4.ec2.internal\" not found" Apr 20 07:50:07.240344 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.240325 2569 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-4.ec2.internal" not found Apr 20 07:50:07.274335 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.274310 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 07:50:07.275494 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.275480 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 07:50:07.275559 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.275505 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 07:50:07.275559 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.275523 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 07:50:07.275559 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.275529 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 07:50:07.275699 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:07.275590 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 07:50:07.278694 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.278677 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:07.328681 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.328630 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:07.329921 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.329908 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-4.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:07.329986 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.329934 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-4.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:07.329986 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.329946 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-4.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:07.329986 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.329975 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.338278 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.338262 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.338322 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:07.338282 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-4.ec2.internal\": node \"ip-10-0-138-4.ec2.internal\" not found" Apr 20 07:50:07.376261 ip-10-0-138-4 
kubenswrapper[2569]: I0420 07:50:07.376238 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal"] Apr 20 07:50:07.378660 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.378647 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.378733 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.378654 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.403734 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.403716 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.408054 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.408042 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.413312 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.413298 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 07:50:07.419069 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.419056 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 07:50:07.474261 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.474241 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9c9000e672991f209623b410921cb239-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal\" 
(UID: \"9c9000e672991f209623b410921cb239\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.474369 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.474270 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c9000e672991f209623b410921cb239-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal\" (UID: \"9c9000e672991f209623b410921cb239\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.474369 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.474290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/53a487abe77507ab89cb5cf1017a5b1f-config\") pod \"kube-apiserver-proxy-ip-10-0-138-4.ec2.internal\" (UID: \"53a487abe77507ab89cb5cf1017a5b1f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.574923 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.574901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c9000e672991f209623b410921cb239-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal\" (UID: \"9c9000e672991f209623b410921cb239\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.575016 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.574928 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/53a487abe77507ab89cb5cf1017a5b1f-config\") pod \"kube-apiserver-proxy-ip-10-0-138-4.ec2.internal\" (UID: \"53a487abe77507ab89cb5cf1017a5b1f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.575016 ip-10-0-138-4 kubenswrapper[2569]: 
I0420 07:50:07.574945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9c9000e672991f209623b410921cb239-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal\" (UID: \"9c9000e672991f209623b410921cb239\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.575016 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.574975 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9c9000e672991f209623b410921cb239-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal\" (UID: \"9c9000e672991f209623b410921cb239\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.575016 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.574989 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c9000e672991f209623b410921cb239-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal\" (UID: \"9c9000e672991f209623b410921cb239\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.575016 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.575000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/53a487abe77507ab89cb5cf1017a5b1f-config\") pod \"kube-apiserver-proxy-ip-10-0-138-4.ec2.internal\" (UID: \"53a487abe77507ab89cb5cf1017a5b1f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.716493 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.716417 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal" Apr 20 07:50:07.721847 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:07.721831 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" Apr 20 07:50:08.070469 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.070451 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 07:50:08.071030 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.070580 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 07:50:08.071030 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.070598 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 07:50:08.071030 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.070621 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 07:50:08.147639 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.147611 2569 apiserver.go:52] "Watching apiserver" Apr 20 07:50:08.153631 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.153605 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 07:50:08.153946 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.153926 
2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-w9j66","openshift-image-registry/node-ca-khs8v","openshift-multus/multus-qh227","openshift-multus/network-metrics-daemon-m5qfv","openshift-network-diagnostics/network-check-target-vmv56","openshift-network-operator/iptables-alerter-s26s7","kube-system/konnectivity-agent-s2r4p","kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88","openshift-dns/node-resolver-jd8vj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal","openshift-multus/multus-additional-cni-plugins-gs6zc","openshift-ovn-kubernetes/ovnkube-node-kqpbd"] Apr 20 07:50:08.155039 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.155009 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 07:45:07 +0000 UTC" deadline="2027-11-10 11:12:09.667130513 +0000 UTC" Apr 20 07:50:08.155098 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.155040 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13659h22m1.512093677s" Apr 20 07:50:08.156648 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.156625 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.157870 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.157848 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-khs8v" Apr 20 07:50:08.157965 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.157920 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.158403 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.158381 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 07:50:08.158498 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.158425 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 07:50:08.158498 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.158425 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ssw4t\""
Apr 20 07:50:08.160121 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.159605 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 07:50:08.160121 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.159638 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5fsx7\""
Apr 20 07:50:08.160121 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.159694 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 07:50:08.160121 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.159793 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 07:50:08.160416 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.160400 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 07:50:08.160474 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.160427 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 07:50:08.160474 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.160445 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 07:50:08.161084 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.160701 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:08.161084 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.160790 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:08.161084 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.160812 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969"
Apr 20 07:50:08.161084 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.160870 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631"
Apr 20 07:50:08.161084 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.160704 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 07:50:08.161084 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.160959 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5sfpv\""
Apr 20 07:50:08.162629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.162605 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s26s7"
Apr 20 07:50:08.163671 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.163655 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s2r4p"
Apr 20 07:50:08.163933 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.163917 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 07:50:08.164064 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.164048 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tx4bb\""
Apr 20 07:50:08.164116 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.164094 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 07:50:08.164299 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.164284 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 07:50:08.164902 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.164886 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.165391 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.165375 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j887h\""
Apr 20 07:50:08.165479 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.165378 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 07:50:08.165479 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.165425 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 07:50:08.166077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.166065 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jd8vj"
Apr 20 07:50:08.166429 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.166413 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 07:50:08.166509 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.166468 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 07:50:08.166509 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.166494 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 07:50:08.166593 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.166514 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nhvx4\""
Apr 20 07:50:08.167491 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.167472 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.167700 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.167660 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 07:50:08.167873 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.167858 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 07:50:08.167910 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.167887 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-m258s\""
Apr 20 07:50:08.168925 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.168909 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4grwr\""
Apr 20 07:50:08.169026 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.169011 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 07:50:08.169110 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.169095 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 07:50:08.169162 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.169098 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.170466 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.170451 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 07:50:08.170548 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.170515 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 07:50:08.170651 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.170637 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sj9vt\""
Apr 20 07:50:08.171234 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.170866 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 07:50:08.171234 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.170879 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 07:50:08.171234 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.170899 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 07:50:08.171234 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.170913 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 07:50:08.171234 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.170922 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 07:50:08.172786 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.172769 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 07:50:08.176916 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.176894 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-systemd\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.177011 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.176917 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8c2b7af6-350a-4223-8075-1a8760e67c96-hosts-file\") pod \"node-resolver-jd8vj\" (UID: \"8c2b7af6-350a-4223-8075-1a8760e67c96\") " pod="openshift-dns/node-resolver-jd8vj"
Apr 20 07:50:08.177011 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.176933 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-system-cni-dir\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.177011 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.176950 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ea6db407-9937-4b0f-84e4-91f5c10786a5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.177011 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.176968 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-os-release\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177011 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.176989 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-run-multus-certs\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177250 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177037 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-log-socket\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177250 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177054 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-cni-bin\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177250 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177069 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea6db407-9937-4b0f-84e4-91f5c10786a5-cni-binary-copy\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.177250 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177103 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-system-cni-dir\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177250 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177153 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-cni-netd\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177250 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177180 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177250 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177230 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-cni-binary-copy\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177257 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw6zr\" (UniqueName: \"kubernetes.io/projected/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-kube-api-access-zw6zr\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177282 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2tkc\" (UniqueName: \"kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc\") pod \"network-check-target-vmv56\" (UID: \"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631\") " pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:08.177527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177308 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv8nm\" (UniqueName: \"kubernetes.io/projected/a63eb0b8-9da1-4db3-8e08-97401e379909-kube-api-access-tv8nm\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.177527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177331 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znx56\" (UniqueName: \"kubernetes.io/projected/f5d50f48-c1cd-490d-8f78-48a66378ab3a-kube-api-access-znx56\") pod \"node-ca-khs8v\" (UID: \"f5d50f48-c1cd-490d-8f78-48a66378ab3a\") " pod="openshift-image-registry/node-ca-khs8v"
Apr 20 07:50:08.177527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177376 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-var-lib-kubelet\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177395 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/75f0e0ec-e11d-4c52-b6bd-ec1da8086f15-agent-certs\") pod \"konnectivity-agent-s2r4p\" (UID: \"75f0e0ec-e11d-4c52-b6bd-ec1da8086f15\") " pod="kube-system/konnectivity-agent-s2r4p"
Apr 20 07:50:08.177527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177414 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-kubelet\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/872f16e1-a280-4e38-b34a-f24ffef351d3-ovnkube-script-lib\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177493 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-socket-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.177527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177521 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxjjl\" (UniqueName: \"kubernetes.io/projected/d7076af5-74cf-4fa6-ac90-de8f6cc674e4-kube-api-access-gxjjl\") pod \"iptables-alerter-s26s7\" (UID: \"d7076af5-74cf-4fa6-ac90-de8f6cc674e4\") " pod="openshift-network-operator/iptables-alerter-s26s7"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177548 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t6fl\" (UniqueName: \"kubernetes.io/projected/872f16e1-a280-4e38-b34a-f24ffef351d3-kube-api-access-4t6fl\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177572 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-cnibin\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177597 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-hostroot\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177621 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-slash\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177651 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177675 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/872f16e1-a280-4e38-b34a-f24ffef351d3-ovn-node-metrics-cert\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177720 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5d50f48-c1cd-490d-8f78-48a66378ab3a-host\") pod \"node-ca-khs8v\" (UID: \"f5d50f48-c1cd-490d-8f78-48a66378ab3a\") " pod="openshift-image-registry/node-ca-khs8v"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177742 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-etc-kubernetes\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177765 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-var-lib-kubelet\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177811 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-conf-dir\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177832 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-daemon-config\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177856 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/75f0e0ec-e11d-4c52-b6bd-ec1da8086f15-konnectivity-ca\") pod \"konnectivity-agent-s2r4p\" (UID: \"75f0e0ec-e11d-4c52-b6bd-ec1da8086f15\") " pod="kube-system/konnectivity-agent-s2r4p"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177879 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-etc-openvswitch\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177903 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:08.177987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177944 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pj27\" (UniqueName: \"kubernetes.io/projected/8c2b7af6-350a-4223-8075-1a8760e67c96-kube-api-access-2pj27\") pod \"node-resolver-jd8vj\" (UID: \"8c2b7af6-350a-4223-8075-1a8760e67c96\") " pod="openshift-dns/node-resolver-jd8vj"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.177969 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc5jw\" (UniqueName: \"kubernetes.io/projected/ea6db407-9937-4b0f-84e4-91f5c10786a5-kube-api-access-jc5jw\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178002 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-sys\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178021 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnkh6\" (UniqueName: \"kubernetes.io/projected/acd15161-0ff9-49b5-a6ea-f34970649228-kube-api-access-nnkh6\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178035 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5d50f48-c1cd-490d-8f78-48a66378ab3a-serviceca\") pod \"node-ca-khs8v\" (UID: \"f5d50f48-c1cd-490d-8f78-48a66378ab3a\") " pod="openshift-image-registry/node-ca-khs8v"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178054 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-socket-dir-parent\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178072 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-var-lib-cni-multus\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178085 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-sysctl-conf\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178099 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-run-k8s-cni-cncf-io\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178115 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-systemd-units\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178136 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-run-systemd\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178151 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-var-lib-openvswitch\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178164 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/872f16e1-a280-4e38-b34a-f24ffef351d3-env-overrides\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178178 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-run\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w6kp\" (UniqueName: \"kubernetes.io/projected/0e96090b-285a-4c1b-98c7-6793626b3969-kube-api-access-5w6kp\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178235 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-device-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.178622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178286 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-cnibin\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178310 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-kubernetes\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178330 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-lib-modules\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178351 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7076af5-74cf-4fa6-ac90-de8f6cc674e4-host-slash\") pod \"iptables-alerter-s26s7\" (UID: \"d7076af5-74cf-4fa6-ac90-de8f6cc674e4\") " pod="openshift-network-operator/iptables-alerter-s26s7"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178366 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-etc-selinux\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178380 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-sysconfig\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178398 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/acd15161-0ff9-49b5-a6ea-f34970649228-tmp\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178416 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-cni-dir\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178430 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-var-lib-cni-bin\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178444 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-run-openvswitch\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178462 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-run-ovn\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178483 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-node-log\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178499 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-registration-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178513 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-sys-fs\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea6db407-9937-4b0f-84e4-91f5c10786a5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178544 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-host\") pod
\"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.179077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178568 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/872f16e1-a280-4e38-b34a-f24ffef351d3-ovnkube-config\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.179564 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178590 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-modprobe-d\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.179564 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.178610 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-sysctl-d\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.179564 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.179059 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d7076af5-74cf-4fa6-ac90-de8f6cc674e4-iptables-alerter-script\") pod \"iptables-alerter-s26s7\" (UID: \"d7076af5-74cf-4fa6-ac90-de8f6cc674e4\") " pod="openshift-network-operator/iptables-alerter-s26s7" Apr 20 07:50:08.179564 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.179096 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8c2b7af6-350a-4223-8075-1a8760e67c96-tmp-dir\") pod \"node-resolver-jd8vj\" (UID: \"8c2b7af6-350a-4223-8075-1a8760e67c96\") " pod="openshift-dns/node-resolver-jd8vj" Apr 20 07:50:08.179564 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.179124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-os-release\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc" Apr 20 07:50:08.179564 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.179138 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/acd15161-0ff9-49b5-a6ea-f34970649228-etc-tuned\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.179564 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.179152 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-run-netns\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.179564 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.179176 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-run-netns\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.180260 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.180241 2569 reflector.go:430] 
"Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 07:50:08.204613 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.204591 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9h5c4" Apr 20 07:50:08.214499 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.214482 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9h5c4" Apr 20 07:50:08.237563 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:08.237342 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53a487abe77507ab89cb5cf1017a5b1f.slice/crio-5d35388373636e21a1e4087481b88544954f7c406aa0f7f10995612df371f41a WatchSource:0}: Error finding container 5d35388373636e21a1e4087481b88544954f7c406aa0f7f10995612df371f41a: Status 404 returned error can't find the container with id 5d35388373636e21a1e4087481b88544954f7c406aa0f7f10995612df371f41a Apr 20 07:50:08.237876 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:08.237859 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c9000e672991f209623b410921cb239.slice/crio-42e2ff11a41f97366a8b61e5c66c423ad53a89bf2c53985c092a8451a1161cba WatchSource:0}: Error finding container 42e2ff11a41f97366a8b61e5c66c423ad53a89bf2c53985c092a8451a1161cba: Status 404 returned error can't find the container with id 42e2ff11a41f97366a8b61e5c66c423ad53a89bf2c53985c092a8451a1161cba Apr 20 07:50:08.242997 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.242981 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:50:08.277883 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.277851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal" event={"ID":"53a487abe77507ab89cb5cf1017a5b1f","Type":"ContainerStarted","Data":"5d35388373636e21a1e4087481b88544954f7c406aa0f7f10995612df371f41a"} Apr 20 07:50:08.278884 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.278864 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" event={"ID":"9c9000e672991f209623b410921cb239","Type":"ContainerStarted","Data":"42e2ff11a41f97366a8b61e5c66c423ad53a89bf2c53985c092a8451a1161cba"} Apr 20 07:50:08.280023 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7076af5-74cf-4fa6-ac90-de8f6cc674e4-host-slash\") pod \"iptables-alerter-s26s7\" (UID: \"d7076af5-74cf-4fa6-ac90-de8f6cc674e4\") " pod="openshift-network-operator/iptables-alerter-s26s7" Apr 20 07:50:08.280080 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280029 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-etc-selinux\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" Apr 20 07:50:08.280080 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280045 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-sysconfig\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.280080 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/acd15161-0ff9-49b5-a6ea-f34970649228-tmp\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.280233 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280090 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-cni-dir\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.280233 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280112 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7076af5-74cf-4fa6-ac90-de8f6cc674e4-host-slash\") pod \"iptables-alerter-s26s7\" (UID: \"d7076af5-74cf-4fa6-ac90-de8f6cc674e4\") " pod="openshift-network-operator/iptables-alerter-s26s7" Apr 20 07:50:08.280233 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-etc-selinux\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" Apr 20 07:50:08.280233 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280141 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-cni-dir\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.280233 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280144 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-sysconfig\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.280233 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-var-lib-cni-bin\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.280418 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280243 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-run-openvswitch\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.280418 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-var-lib-cni-bin\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.280418 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280272 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-run-ovn\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.280418 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280290 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-run-openvswitch\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.280418 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280331 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-node-log\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.280418 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280363 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-node-log\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.280418 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280365 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-registration-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" Apr 20 07:50:08.280418 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280392 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-run-ovn\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.280728 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-registration-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" Apr 20 07:50:08.280728 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-sys-fs\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" Apr 20 07:50:08.280728 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280415 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 07:50:08.280851 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280761 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-sys-fs\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" Apr 20 07:50:08.280851 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280757 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea6db407-9937-4b0f-84e4-91f5c10786a5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc" Apr 20 07:50:08.280851 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-host\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.280851 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280829 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/872f16e1-a280-4e38-b34a-f24ffef351d3-ovnkube-config\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.281043 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280937 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-modprobe-d\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.281043 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.280971 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-sysctl-d\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.281043 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281003 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d7076af5-74cf-4fa6-ac90-de8f6cc674e4-iptables-alerter-script\") pod \"iptables-alerter-s26s7\" (UID: \"d7076af5-74cf-4fa6-ac90-de8f6cc674e4\") " pod="openshift-network-operator/iptables-alerter-s26s7" Apr 20 07:50:08.281043 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281028 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/8c2b7af6-350a-4223-8075-1a8760e67c96-tmp-dir\") pod \"node-resolver-jd8vj\" (UID: \"8c2b7af6-350a-4223-8075-1a8760e67c96\") " pod="openshift-dns/node-resolver-jd8vj" Apr 20 07:50:08.281238 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-os-release\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc" Apr 20 07:50:08.281238 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281108 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/acd15161-0ff9-49b5-a6ea-f34970649228-etc-tuned\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.281238 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281158 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-run-netns\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.281238 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-run-netns\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.281238 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281236 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-systemd\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.281472 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281263 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8c2b7af6-350a-4223-8075-1a8760e67c96-hosts-file\") pod \"node-resolver-jd8vj\" (UID: \"8c2b7af6-350a-4223-8075-1a8760e67c96\") " pod="openshift-dns/node-resolver-jd8vj" Apr 20 07:50:08.281472 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-system-cni-dir\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc" Apr 20 07:50:08.281472 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ea6db407-9937-4b0f-84e4-91f5c10786a5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc" Apr 20 07:50:08.281472 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281380 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ea6db407-9937-4b0f-84e4-91f5c10786a5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc" Apr 20 07:50:08.281472 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281397 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-os-release\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc" Apr 20 07:50:08.281472 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281435 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-os-release\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.281714 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281486 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-systemd\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.281714 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281498 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-os-release\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.281714 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281526 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-run-multus-certs\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.281714 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281561 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-log-socket\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.281714 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281589 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-cni-bin\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.281714 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281623 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea6db407-9937-4b0f-84e4-91f5c10786a5-cni-binary-copy\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc" Apr 20 07:50:08.281714 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8c2b7af6-350a-4223-8075-1a8760e67c96-hosts-file\") pod \"node-resolver-jd8vj\" (UID: \"8c2b7af6-350a-4223-8075-1a8760e67c96\") " pod="openshift-dns/node-resolver-jd8vj" Apr 20 07:50:08.281714 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281656 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-system-cni-dir\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.281714 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281688 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-cni-netd\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.281714 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281716 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-system-cni-dir\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281754 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-cni-binary-copy\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281779 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw6zr\" (UniqueName: \"kubernetes.io/projected/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-kube-api-access-zw6zr\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281809 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tkc\" (UniqueName: \"kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc\") pod \"network-check-target-vmv56\" (UID: \"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631\") " pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281840 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv8nm\" (UniqueName: \"kubernetes.io/projected/a63eb0b8-9da1-4db3-8e08-97401e379909-kube-api-access-tv8nm\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znx56\" (UniqueName: \"kubernetes.io/projected/f5d50f48-c1cd-490d-8f78-48a66378ab3a-kube-api-access-znx56\") pod \"node-ca-khs8v\" (UID: \"f5d50f48-c1cd-490d-8f78-48a66378ab3a\") " pod="openshift-image-registry/node-ca-khs8v"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281900 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-var-lib-kubelet\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281925 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/75f0e0ec-e11d-4c52-b6bd-ec1da8086f15-agent-certs\") pod \"konnectivity-agent-s2r4p\" (UID: \"75f0e0ec-e11d-4c52-b6bd-ec1da8086f15\") " pod="kube-system/konnectivity-agent-s2r4p"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281957 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-kubelet\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.281987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/872f16e1-a280-4e38-b34a-f24ffef351d3-ovnkube-script-lib\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-socket-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282036 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/872f16e1-a280-4e38-b34a-f24ffef351d3-ovnkube-config\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxjjl\" (UniqueName: \"kubernetes.io/projected/d7076af5-74cf-4fa6-ac90-de8f6cc674e4-kube-api-access-gxjjl\") pod \"iptables-alerter-s26s7\" (UID: \"d7076af5-74cf-4fa6-ac90-de8f6cc674e4\") " pod="openshift-network-operator/iptables-alerter-s26s7"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282096 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4t6fl\" (UniqueName: \"kubernetes.io/projected/872f16e1-a280-4e38-b34a-f24ffef351d3-kube-api-access-4t6fl\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282116 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-log-socket\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282133 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-cnibin\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282168 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-cni-bin\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282202 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d7076af5-74cf-4fa6-ac90-de8f6cc674e4-iptables-alerter-script\") pod \"iptables-alerter-s26s7\" (UID: \"d7076af5-74cf-4fa6-ac90-de8f6cc674e4\") " pod="openshift-network-operator/iptables-alerter-s26s7"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282238 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-hostroot\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-slash\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282320 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-system-cni-dir\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/872f16e1-a280-4e38-b34a-f24ffef351d3-ovn-node-metrics-cert\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282361 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282367 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-cni-netd\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282391 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5d50f48-c1cd-490d-8f78-48a66378ab3a-host\") pod \"node-ca-khs8v\" (UID: \"f5d50f48-c1cd-490d-8f78-48a66378ab3a\") " pod="openshift-image-registry/node-ca-khs8v"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282411 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282421 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-etc-kubernetes\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282492 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-var-lib-kubelet\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282516 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-run-netns\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282526 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-conf-dir\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282576 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-conf-dir\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.282839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282576 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-daemon-config\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282616 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/75f0e0ec-e11d-4c52-b6bd-ec1da8086f15-konnectivity-ca\") pod \"konnectivity-agent-s2r4p\" (UID: \"75f0e0ec-e11d-4c52-b6bd-ec1da8086f15\") " pod="kube-system/konnectivity-agent-s2r4p"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282644 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-etc-openvswitch\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282676 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pj27\" (UniqueName: \"kubernetes.io/projected/8c2b7af6-350a-4223-8075-1a8760e67c96-kube-api-access-2pj27\") pod \"node-resolver-jd8vj\" (UID: \"8c2b7af6-350a-4223-8075-1a8760e67c96\") " pod="openshift-dns/node-resolver-jd8vj"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282740 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jc5jw\" (UniqueName: \"kubernetes.io/projected/ea6db407-9937-4b0f-84e4-91f5c10786a5-kube-api-access-jc5jw\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282768 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-sys\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnkh6\" (UniqueName: \"kubernetes.io/projected/acd15161-0ff9-49b5-a6ea-f34970649228-kube-api-access-nnkh6\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282824 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5d50f48-c1cd-490d-8f78-48a66378ab3a-serviceca\") pod \"node-ca-khs8v\" (UID: \"f5d50f48-c1cd-490d-8f78-48a66378ab3a\") " pod="openshift-image-registry/node-ca-khs8v"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-socket-dir-parent\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282884 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-cni-binary-copy\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282943 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-var-lib-cni-multus\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283013 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-daemon-config\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-run-multus-certs\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8c2b7af6-350a-4223-8075-1a8760e67c96-tmp-dir\") pod \"node-resolver-jd8vj\" (UID: \"8c2b7af6-350a-4223-8075-1a8760e67c96\") " pod="openshift-dns/node-resolver-jd8vj"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283183 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-var-lib-kubelet\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-host\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283293 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-kubelet\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.283629 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283519 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ea6db407-9937-4b0f-84e4-91f5c10786a5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283587 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-var-lib-kubelet\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283793 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ea6db407-9937-4b0f-84e4-91f5c10786a5-cni-binary-copy\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282885 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-var-lib-cni-multus\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-sysctl-conf\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-run-k8s-cni-cncf-io\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283916 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-systemd-units\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-run-systemd\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.283976 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-var-lib-openvswitch\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284002 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/872f16e1-a280-4e38-b34a-f24ffef351d3-env-overrides\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284014 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/872f16e1-a280-4e38-b34a-f24ffef351d3-ovnkube-script-lib\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284029 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-run\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284074 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-run\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w6kp\" (UniqueName: \"kubernetes.io/projected/0e96090b-285a-4c1b-98c7-6793626b3969-kube-api-access-5w6kp\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-device-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.284329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284172 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-cnibin\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.285042 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284400 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/acd15161-0ff9-49b5-a6ea-f34970649228-tmp\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.285042 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284480 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-run-systemd\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.285042 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284593 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-sysctl-d\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.285042 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284635 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-kubernetes\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.285042 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284671 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-lib-modules\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.285042 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284725 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-etc-openvswitch\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.285042 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284804 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-lib-modules\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.285042 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.284835 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:50:08.285042 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.284853 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-var-lib-openvswitch\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.285042 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.284933 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs podName:0e96090b-285a-4c1b-98c7-6793626b3969 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:08.784892663 +0000 UTC m=+2.081465648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs") pod "network-metrics-daemon-m5qfv" (UID: "0e96090b-285a-4c1b-98c7-6793626b3969") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:50:08.285503 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/872f16e1-a280-4e38-b34a-f24ffef351d3-env-overrides\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.285503 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285367 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-run-netns\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.285503 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285435 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-hostroot\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.285503 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285443 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/75f0e0ec-e11d-4c52-b6bd-ec1da8086f15-konnectivity-ca\") pod \"konnectivity-agent-s2r4p\" (UID: \"75f0e0ec-e11d-4c52-b6bd-ec1da8086f15\") " pod="kube-system/konnectivity-agent-s2r4p"
Apr 20 07:50:08.285684 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285504 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.285684 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.285684 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-kubernetes\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.285684 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285622 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-cnibin\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.285873 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285758 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5d50f48-c1cd-490d-8f78-48a66378ab3a-serviceca\") pod \"node-ca-khs8v\" (UID: \"f5d50f48-c1cd-490d-8f78-48a66378ab3a\") " pod="openshift-image-registry/node-ca-khs8v"
Apr 20 07:50:08.285873 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285781 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-device-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.285873 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285824 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-host-slash\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.285873 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285846 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-sys\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.286045 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285858 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5d50f48-c1cd-490d-8f78-48a66378ab3a-host\") pod \"node-ca-khs8v\" (UID: \"f5d50f48-c1cd-490d-8f78-48a66378ab3a\") " pod="openshift-image-registry/node-ca-khs8v"
Apr 20 07:50:08.286045 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285891 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-etc-kubernetes\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.286045 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.285925 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/872f16e1-a280-4e38-b34a-f24ffef351d3-systemd-units\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.286181 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.286072 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-host-run-k8s-cni-cncf-io\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.286181 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.286086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-sysctl-conf\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.286181 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.286115 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-multus-socket-dir-parent\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.286181 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.282202 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-cnibin\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227"
Apr 20 07:50:08.286457 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.286240 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/acd15161-0ff9-49b5-a6ea-f34970649228-etc-modprobe-d\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66"
Apr 20 07:50:08.286457 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.286276 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ea6db407-9937-4b0f-84e4-91f5c10786a5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc"
Apr 20 07:50:08.286937 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.286894 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a63eb0b8-9da1-4db3-8e08-97401e379909-socket-dir\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88"
Apr 20 07:50:08.288391 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.288364 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/872f16e1-a280-4e38-b34a-f24ffef351d3-ovn-node-metrics-cert\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd"
Apr 20 07:50:08.288575 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.288548 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/75f0e0ec-e11d-4c52-b6bd-ec1da8086f15-agent-certs\") pod \"konnectivity-agent-s2r4p\" (UID: \"75f0e0ec-e11d-4c52-b6bd-ec1da8086f15\") " pod="kube-system/konnectivity-agent-s2r4p"
Apr 20 07:50:08.288691 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.288666 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\"
(UniqueName: \"kubernetes.io/empty-dir/acd15161-0ff9-49b5-a6ea-f34970649228-etc-tuned\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.289665 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.289646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw6zr\" (UniqueName: \"kubernetes.io/projected/a87d5cb5-84a7-4b46-9c01-785c30aedcbf-kube-api-access-zw6zr\") pod \"multus-qh227\" (UID: \"a87d5cb5-84a7-4b46-9c01-785c30aedcbf\") " pod="openshift-multus/multus-qh227" Apr 20 07:50:08.289947 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.289927 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znx56\" (UniqueName: \"kubernetes.io/projected/f5d50f48-c1cd-490d-8f78-48a66378ab3a-kube-api-access-znx56\") pod \"node-ca-khs8v\" (UID: \"f5d50f48-c1cd-490d-8f78-48a66378ab3a\") " pod="openshift-image-registry/node-ca-khs8v" Apr 20 07:50:08.290600 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.290585 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w6kp\" (UniqueName: \"kubernetes.io/projected/0e96090b-285a-4c1b-98c7-6793626b3969-kube-api-access-5w6kp\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:08.293974 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.293955 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:08.293974 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.293977 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:08.294184 ip-10-0-138-4 kubenswrapper[2569]: E0420 
07:50:08.293990 2569 projected.go:194] Error preparing data for projected volume kube-api-access-c2tkc for pod openshift-network-diagnostics/network-check-target-vmv56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:08.294184 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.294064 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc podName:0a4dfd92-0a59-4e3f-bb86-c3a74ffec631 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:08.794045175 +0000 UTC m=+2.090618152 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-c2tkc" (UniqueName: "kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc") pod "network-check-target-vmv56" (UID: "0a4dfd92-0a59-4e3f-bb86-c3a74ffec631") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:08.295424 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.295400 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pj27\" (UniqueName: \"kubernetes.io/projected/8c2b7af6-350a-4223-8075-1a8760e67c96-kube-api-access-2pj27\") pod \"node-resolver-jd8vj\" (UID: \"8c2b7af6-350a-4223-8075-1a8760e67c96\") " pod="openshift-dns/node-resolver-jd8vj" Apr 20 07:50:08.295628 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.295606 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnkh6\" (UniqueName: \"kubernetes.io/projected/acd15161-0ff9-49b5-a6ea-f34970649228-kube-api-access-nnkh6\") pod \"tuned-w9j66\" (UID: \"acd15161-0ff9-49b5-a6ea-f34970649228\") " pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.295715 ip-10-0-138-4 kubenswrapper[2569]: 
I0420 07:50:08.295700 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc5jw\" (UniqueName: \"kubernetes.io/projected/ea6db407-9937-4b0f-84e4-91f5c10786a5-kube-api-access-jc5jw\") pod \"multus-additional-cni-plugins-gs6zc\" (UID: \"ea6db407-9937-4b0f-84e4-91f5c10786a5\") " pod="openshift-multus/multus-additional-cni-plugins-gs6zc" Apr 20 07:50:08.295813 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.295796 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv8nm\" (UniqueName: \"kubernetes.io/projected/a63eb0b8-9da1-4db3-8e08-97401e379909-kube-api-access-tv8nm\") pod \"aws-ebs-csi-driver-node-8wx88\" (UID: \"a63eb0b8-9da1-4db3-8e08-97401e379909\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" Apr 20 07:50:08.296396 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.296379 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t6fl\" (UniqueName: \"kubernetes.io/projected/872f16e1-a280-4e38-b34a-f24ffef351d3-kube-api-access-4t6fl\") pod \"ovnkube-node-kqpbd\" (UID: \"872f16e1-a280-4e38-b34a-f24ffef351d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.296596 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.296580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxjjl\" (UniqueName: \"kubernetes.io/projected/d7076af5-74cf-4fa6-ac90-de8f6cc674e4-kube-api-access-gxjjl\") pod \"iptables-alerter-s26s7\" (UID: \"d7076af5-74cf-4fa6-ac90-de8f6cc674e4\") " pod="openshift-network-operator/iptables-alerter-s26s7" Apr 20 07:50:08.487831 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.487727 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-w9j66" Apr 20 07:50:08.493635 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:08.493601 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacd15161_0ff9_49b5_a6ea_f34970649228.slice/crio-f30c824d4a49a929fd1c7fa39e13425101adbcbade818b1e1c24d14b58ee636e WatchSource:0}: Error finding container f30c824d4a49a929fd1c7fa39e13425101adbcbade818b1e1c24d14b58ee636e: Status 404 returned error can't find the container with id f30c824d4a49a929fd1c7fa39e13425101adbcbade818b1e1c24d14b58ee636e Apr 20 07:50:08.501422 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.501401 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-khs8v" Apr 20 07:50:08.507853 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:08.507823 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5d50f48_c1cd_490d_8f78_48a66378ab3a.slice/crio-370ee6c91cf73157a691fb6c7ac422ab53db31f52493f1f24322671a16854e59 WatchSource:0}: Error finding container 370ee6c91cf73157a691fb6c7ac422ab53db31f52493f1f24322671a16854e59: Status 404 returned error can't find the container with id 370ee6c91cf73157a691fb6c7ac422ab53db31f52493f1f24322671a16854e59 Apr 20 07:50:08.508484 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.508466 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qh227" Apr 20 07:50:08.522994 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.522973 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-s26s7" Apr 20 07:50:08.528544 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:08.528518 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7076af5_74cf_4fa6_ac90_de8f6cc674e4.slice/crio-73c418cffc45efd250af91c315033869d692749fba3846e44f7ce68e11d871e7 WatchSource:0}: Error finding container 73c418cffc45efd250af91c315033869d692749fba3846e44f7ce68e11d871e7: Status 404 returned error can't find the container with id 73c418cffc45efd250af91c315033869d692749fba3846e44f7ce68e11d871e7 Apr 20 07:50:08.547792 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.547763 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s2r4p" Apr 20 07:50:08.552979 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.552960 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" Apr 20 07:50:08.553238 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:08.553194 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75f0e0ec_e11d_4c52_b6bd_ec1da8086f15.slice/crio-59b598e47293430e3c471fde9f0f77f202a8bf70f20cfabedd1ef0840e6f27c1 WatchSource:0}: Error finding container 59b598e47293430e3c471fde9f0f77f202a8bf70f20cfabedd1ef0840e6f27c1: Status 404 returned error can't find the container with id 59b598e47293430e3c471fde9f0f77f202a8bf70f20cfabedd1ef0840e6f27c1 Apr 20 07:50:08.558628 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.558610 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jd8vj" Apr 20 07:50:08.559175 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:08.559154 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda63eb0b8_9da1_4db3_8e08_97401e379909.slice/crio-ed9939783a6b50fccf508875594df33e23ca226b794bfd65c90816fbf79d2a6d WatchSource:0}: Error finding container ed9939783a6b50fccf508875594df33e23ca226b794bfd65c90816fbf79d2a6d: Status 404 returned error can't find the container with id ed9939783a6b50fccf508875594df33e23ca226b794bfd65c90816fbf79d2a6d Apr 20 07:50:08.564004 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.563974 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" Apr 20 07:50:08.565975 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:08.565948 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c2b7af6_350a_4223_8075_1a8760e67c96.slice/crio-42a901cfd6f95b3bc789baffe7d14198d6b53db0bb365fd0acdee73fe3c01332 WatchSource:0}: Error finding container 42a901cfd6f95b3bc789baffe7d14198d6b53db0bb365fd0acdee73fe3c01332: Status 404 returned error can't find the container with id 42a901cfd6f95b3bc789baffe7d14198d6b53db0bb365fd0acdee73fe3c01332 Apr 20 07:50:08.567898 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.567870 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:08.571316 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:08.571293 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6db407_9937_4b0f_84e4_91f5c10786a5.slice/crio-db4e6bc7d4418505dc9da733a6e7902395b4a188707480f96bf16446a1e272b9 WatchSource:0}: Error finding container db4e6bc7d4418505dc9da733a6e7902395b4a188707480f96bf16446a1e272b9: Status 404 returned error can't find the container with id db4e6bc7d4418505dc9da733a6e7902395b4a188707480f96bf16446a1e272b9 Apr 20 07:50:08.575590 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:08.575568 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872f16e1_a280_4e38_b34a_f24ffef351d3.slice/crio-739d17c05fbf2827e5746b51f4d5e2e765a681477bafc660673927f3f7f744db WatchSource:0}: Error finding container 739d17c05fbf2827e5746b51f4d5e2e765a681477bafc660673927f3f7f744db: Status 404 returned error can't find the container with id 739d17c05fbf2827e5746b51f4d5e2e765a681477bafc660673927f3f7f744db Apr 20 07:50:08.788943 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.788913 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:08.789099 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.789073 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:08.789167 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.789149 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs podName:0e96090b-285a-4c1b-98c7-6793626b3969 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:09.789128669 +0000 UTC m=+3.085701647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs") pod "network-metrics-daemon-m5qfv" (UID: "0e96090b-285a-4c1b-98c7-6793626b3969") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:08.890575 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:08.889949 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tkc\" (UniqueName: \"kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc\") pod \"network-check-target-vmv56\" (UID: \"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631\") " pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:08.890575 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.890121 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:08.890575 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.890140 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:08.890575 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.890153 2569 projected.go:194] Error preparing data for projected volume kube-api-access-c2tkc for pod openshift-network-diagnostics/network-check-target-vmv56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:08.890575 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:08.890231 2569 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc podName:0a4dfd92-0a59-4e3f-bb86-c3a74ffec631 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:09.890198945 +0000 UTC m=+3.186771913 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-c2tkc" (UniqueName: "kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc") pod "network-check-target-vmv56" (UID: "0a4dfd92-0a59-4e3f-bb86-c3a74ffec631") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:09.000334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.000002 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:09.037468 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.037283 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:09.216055 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.215968 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 07:45:08 +0000 UTC" deadline="2027-10-25 11:28:09.819917012 +0000 UTC" Apr 20 07:50:09.216055 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.216006 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13275h38m0.60391482s" Apr 20 07:50:09.301722 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.301682 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:09.309568 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.309534 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" event={"ID":"a63eb0b8-9da1-4db3-8e08-97401e379909","Type":"ContainerStarted","Data":"ed9939783a6b50fccf508875594df33e23ca226b794bfd65c90816fbf79d2a6d"} Apr 20 07:50:09.315995 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.315944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s2r4p" event={"ID":"75f0e0ec-e11d-4c52-b6bd-ec1da8086f15","Type":"ContainerStarted","Data":"59b598e47293430e3c471fde9f0f77f202a8bf70f20cfabedd1ef0840e6f27c1"} Apr 20 07:50:09.321483 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.321437 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qh227" event={"ID":"a87d5cb5-84a7-4b46-9c01-785c30aedcbf","Type":"ContainerStarted","Data":"10bad4542df5d1e6d042dd6a2fa53bf665504aa21d93b3de8a2310f1247da59d"} Apr 20 07:50:09.327780 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.327756 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-w9j66" event={"ID":"acd15161-0ff9-49b5-a6ea-f34970649228","Type":"ContainerStarted","Data":"f30c824d4a49a929fd1c7fa39e13425101adbcbade818b1e1c24d14b58ee636e"} Apr 20 07:50:09.343734 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.343667 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" event={"ID":"ea6db407-9937-4b0f-84e4-91f5c10786a5","Type":"ContainerStarted","Data":"db4e6bc7d4418505dc9da733a6e7902395b4a188707480f96bf16446a1e272b9"} Apr 20 07:50:09.354234 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.350366 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jd8vj" event={"ID":"8c2b7af6-350a-4223-8075-1a8760e67c96","Type":"ContainerStarted","Data":"42a901cfd6f95b3bc789baffe7d14198d6b53db0bb365fd0acdee73fe3c01332"} Apr 20 07:50:09.367394 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.367364 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s26s7" event={"ID":"d7076af5-74cf-4fa6-ac90-de8f6cc674e4","Type":"ContainerStarted","Data":"73c418cffc45efd250af91c315033869d692749fba3846e44f7ce68e11d871e7"} Apr 20 07:50:09.374863 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.374325 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-khs8v" event={"ID":"f5d50f48-c1cd-490d-8f78-48a66378ab3a","Type":"ContainerStarted","Data":"370ee6c91cf73157a691fb6c7ac422ab53db31f52493f1f24322671a16854e59"} Apr 20 07:50:09.380677 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.380418 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" event={"ID":"872f16e1-a280-4e38-b34a-f24ffef351d3","Type":"ContainerStarted","Data":"739d17c05fbf2827e5746b51f4d5e2e765a681477bafc660673927f3f7f744db"} Apr 20 07:50:09.798459 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.798419 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:09.798657 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:09.798580 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:09.798657 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:09.798644 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs podName:0e96090b-285a-4c1b-98c7-6793626b3969 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:11.798623421 +0000 UTC m=+5.095196386 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs") pod "network-metrics-daemon-m5qfv" (UID: "0e96090b-285a-4c1b-98c7-6793626b3969") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:09.899911 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:09.899855 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tkc\" (UniqueName: \"kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc\") pod \"network-check-target-vmv56\" (UID: \"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631\") " pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:09.900073 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:09.899993 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:09.900073 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:09.900014 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:09.900073 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:09.900027 2569 projected.go:194] Error preparing data for projected volume kube-api-access-c2tkc for pod openshift-network-diagnostics/network-check-target-vmv56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:09.900361 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:09.900086 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc podName:0a4dfd92-0a59-4e3f-bb86-c3a74ffec631 nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:11.900064561 +0000 UTC m=+5.196637537 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c2tkc" (UniqueName: "kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc") pod "network-check-target-vmv56" (UID: "0a4dfd92-0a59-4e3f-bb86-c3a74ffec631") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:10.217218 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:10.217073 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 07:45:08 +0000 UTC" deadline="2028-01-17 04:46:31.110524958 +0000 UTC" Apr 20 07:50:10.217218 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:10.217110 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15284h56m20.893418225s" Apr 20 07:50:10.276179 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:10.276140 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:10.276366 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:10.276274 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631" Apr 20 07:50:10.276744 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:10.276723 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:10.276855 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:10.276834 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969" Apr 20 07:50:11.814490 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:11.814451 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:11.814958 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:11.814641 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:11.814958 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:11.814708 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs podName:0e96090b-285a-4c1b-98c7-6793626b3969 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:15.814690062 +0000 UTC m=+9.111263025 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs") pod "network-metrics-daemon-m5qfv" (UID: "0e96090b-285a-4c1b-98c7-6793626b3969") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:50:11.915397 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:11.915342    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tkc\" (UniqueName: \"kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc\") pod \"network-check-target-vmv56\" (UID: \"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631\") " pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:11.915591 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:11.915504    2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:50:11.915591 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:11.915526    2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:50:11.915591 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:11.915539    2569 projected.go:194] Error preparing data for projected volume kube-api-access-c2tkc for pod openshift-network-diagnostics/network-check-target-vmv56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:50:11.915748 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:11.915595    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc podName:0a4dfd92-0a59-4e3f-bb86-c3a74ffec631 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:15.915576218 +0000 UTC m=+9.212149180 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c2tkc" (UniqueName: "kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc") pod "network-check-target-vmv56" (UID: "0a4dfd92-0a59-4e3f-bb86-c3a74ffec631") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:50:12.275947 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:12.275918    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:12.276114 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:12.276042    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631"
Apr 20 07:50:12.276177 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:12.276149    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:12.276300 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:12.276275    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969"
Apr 20 07:50:14.276938 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:14.276633    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:14.276938 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:14.276824    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:14.276938 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:14.276866    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631"
Apr 20 07:50:14.276938 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:14.276908    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969"
Apr 20 07:50:15.850135 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:15.849761    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:15.850135 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:15.849940    2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:50:15.850135 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:15.850006    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs podName:0e96090b-285a-4c1b-98c7-6793626b3969 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:23.849987312 +0000 UTC m=+17.146560278 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs") pod "network-metrics-daemon-m5qfv" (UID: "0e96090b-285a-4c1b-98c7-6793626b3969") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:50:15.950424 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:15.950392    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tkc\" (UniqueName: \"kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc\") pod \"network-check-target-vmv56\" (UID: \"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631\") " pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:15.950603 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:15.950587    2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:50:15.950649 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:15.950614    2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:50:15.950649 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:15.950628    2569 projected.go:194] Error preparing data for projected volume kube-api-access-c2tkc for pod openshift-network-diagnostics/network-check-target-vmv56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:50:15.950719 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:15.950694    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc podName:0a4dfd92-0a59-4e3f-bb86-c3a74ffec631 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:23.950674807 +0000 UTC m=+17.247247784 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c2tkc" (UniqueName: "kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc") pod "network-check-target-vmv56" (UID: "0a4dfd92-0a59-4e3f-bb86-c3a74ffec631") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:50:16.276220 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:16.276180    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:16.276392 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:16.276190    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:16.276392 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:16.276302    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631"
Apr 20 07:50:16.276516 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:16.276397    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969"
Apr 20 07:50:16.893680 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:16.893596    2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-r4n6p"]
Apr 20 07:50:16.896671 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:16.896648    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:16.896799 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:16.896728    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd"
Apr 20 07:50:16.957585 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:16.957537    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d06eb14c-741b-46bf-aada-fd390434ddfd-kubelet-config\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:16.957775 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:16.957597    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d06eb14c-741b-46bf-aada-fd390434ddfd-dbus\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:16.957775 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:16.957631    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:17.058255 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:17.058202    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d06eb14c-741b-46bf-aada-fd390434ddfd-dbus\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:17.058450 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:17.058280    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:17.058450 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:17.058297    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d06eb14c-741b-46bf-aada-fd390434ddfd-dbus\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:17.058450 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:17.058363    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d06eb14c-741b-46bf-aada-fd390434ddfd-kubelet-config\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:17.058450 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:17.058398    2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 07:50:17.058450 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:17.058447    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d06eb14c-741b-46bf-aada-fd390434ddfd-kubelet-config\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:17.058702 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:17.058453    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret podName:d06eb14c-741b-46bf-aada-fd390434ddfd nodeName:}" failed. No retries permitted until 2026-04-20 07:50:17.558434906 +0000 UTC m=+10.855007885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret") pod "global-pull-secret-syncer-r4n6p" (UID: "d06eb14c-741b-46bf-aada-fd390434ddfd") : object "kube-system"/"original-pull-secret" not registered
Apr 20 07:50:17.562814 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:17.562777    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:17.563018 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:17.562940    2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 07:50:17.563018 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:17.563017    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret podName:d06eb14c-741b-46bf-aada-fd390434ddfd nodeName:}" failed. No retries permitted until 2026-04-20 07:50:18.562997117 +0000 UTC m=+11.859570085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret") pod "global-pull-secret-syncer-r4n6p" (UID: "d06eb14c-741b-46bf-aada-fd390434ddfd") : object "kube-system"/"original-pull-secret" not registered
Apr 20 07:50:18.276600 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:18.276566    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:18.276985 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:18.276569    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:18.276985 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:18.276684    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd"
Apr 20 07:50:18.276985 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:18.276565    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:18.276985 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:18.276764    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969"
Apr 20 07:50:18.276985 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:18.276880    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631"
Apr 20 07:50:18.570841 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:18.570806    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:18.571023 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:18.570981    2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 07:50:18.571077 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:18.571051    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret podName:d06eb14c-741b-46bf-aada-fd390434ddfd nodeName:}" failed. No retries permitted until 2026-04-20 07:50:20.571031763 +0000 UTC m=+13.867604728 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret") pod "global-pull-secret-syncer-r4n6p" (UID: "d06eb14c-741b-46bf-aada-fd390434ddfd") : object "kube-system"/"original-pull-secret" not registered
Apr 20 07:50:20.276092 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:20.276056    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:20.276603 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:20.276056    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:20.276603 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:20.276177    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd"
Apr 20 07:50:20.276603 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:20.276273    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631"
Apr 20 07:50:20.276603 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:20.276063    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:20.276603 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:20.276385    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969"
Apr 20 07:50:20.587183 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:20.587100    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:20.587347 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:20.587235    2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 07:50:20.587347 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:20.587286    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret podName:d06eb14c-741b-46bf-aada-fd390434ddfd nodeName:}" failed. No retries permitted until 2026-04-20 07:50:24.587273755 +0000 UTC m=+17.883846721 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret") pod "global-pull-secret-syncer-r4n6p" (UID: "d06eb14c-741b-46bf-aada-fd390434ddfd") : object "kube-system"/"original-pull-secret" not registered
Apr 20 07:50:22.275821 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:22.275793    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:22.276162 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:22.275801    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:22.276162 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:22.275887    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631"
Apr 20 07:50:22.276162 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:22.275801    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:22.276162 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:22.275966    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd"
Apr 20 07:50:22.276162 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:22.276032    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969"
Apr 20 07:50:23.912195 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:23.912154    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:23.912700 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:23.912339    2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:50:23.912700 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:23.912419    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs podName:0e96090b-285a-4c1b-98c7-6793626b3969 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:39.912398403 +0000 UTC m=+33.208971378 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs") pod "network-metrics-daemon-m5qfv" (UID: "0e96090b-285a-4c1b-98c7-6793626b3969") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:50:24.013390 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:24.013350    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tkc\" (UniqueName: \"kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc\") pod \"network-check-target-vmv56\" (UID: \"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631\") " pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:24.013575 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:24.013530    2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:50:24.013575 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:24.013550    2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:50:24.013575 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:24.013563    2569 projected.go:194] Error preparing data for projected volume kube-api-access-c2tkc for pod openshift-network-diagnostics/network-check-target-vmv56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:50:24.013692 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:24.013613    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc podName:0a4dfd92-0a59-4e3f-bb86-c3a74ffec631 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:40.013597878 +0000 UTC m=+33.310170841 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-c2tkc" (UniqueName: "kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc") pod "network-check-target-vmv56" (UID: "0a4dfd92-0a59-4e3f-bb86-c3a74ffec631") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:50:24.276687 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:24.276655    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:24.276858 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:24.276660    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:24.276858 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:24.276778    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd"
Apr 20 07:50:24.276858 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:24.276662    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:24.277018 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:24.276853    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631"
Apr 20 07:50:24.277018 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:24.276926    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969"
Apr 20 07:50:24.617465 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:24.617379    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:24.617613 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:24.617527    2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 07:50:24.617613 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:24.617597    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret podName:d06eb14c-741b-46bf-aada-fd390434ddfd nodeName:}" failed. No retries permitted until 2026-04-20 07:50:32.617582007 +0000 UTC m=+25.914154974 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret") pod "global-pull-secret-syncer-r4n6p" (UID: "d06eb14c-741b-46bf-aada-fd390434ddfd") : object "kube-system"/"original-pull-secret" not registered
Apr 20 07:50:26.276006 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.275670    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p"
Apr 20 07:50:26.276741 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.275698    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:50:26.276741 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:26.276097    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd"
Apr 20 07:50:26.276741 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.275732    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56"
Apr 20 07:50:26.276741 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:26.276174    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969"
Apr 20 07:50:26.276741 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:26.276241    2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631"
Apr 20 07:50:26.416398 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.416362    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-w9j66" event={"ID":"acd15161-0ff9-49b5-a6ea-f34970649228","Type":"ContainerStarted","Data":"e872394e4101a906f8e057c010945489f295bb81784945e382cb1296428ec66e"}
Apr 20 07:50:26.419105 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.419054    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal" event={"ID":"53a487abe77507ab89cb5cf1017a5b1f","Type":"ContainerStarted","Data":"0c1890c2d6239132e6606cee5fa3dad65b3826fd1a634ba3df9fc48d2619d200"}
Apr 20 07:50:26.422437 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.422361    2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log"
Apr 20 07:50:26.422713 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.422691    2569 generic.go:358] "Generic (PLEG): container finished" podID="872f16e1-a280-4e38-b34a-f24ffef351d3" containerID="bd5b3ecae436dc0edd6314550d44c0ac4c2d1993387086e27c6d426cf88ec3e0" exitCode=1
Apr 20 07:50:26.422791 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.422749    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" event={"ID":"872f16e1-a280-4e38-b34a-f24ffef351d3","Type":"ContainerStarted","Data":"30f6085b3f51267a09c42548f02e518122808493d962d4a95a6bb53ff1e68d04"}
Apr 20 07:50:26.422791 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.422765    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" event={"ID":"872f16e1-a280-4e38-b34a-f24ffef351d3","Type":"ContainerStarted","Data":"1040173da042da8a6c5f7c1e2a36186cfc33efbcbadcda0cb1e13c330dc3bc2f"}
Apr 20 07:50:26.422791 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.422775    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" event={"ID":"872f16e1-a280-4e38-b34a-f24ffef351d3","Type":"ContainerStarted","Data":"c7527e6fddfa9b9f9c5bfcfdea67b6449cd5332fa7bb15b775b83ffaad89231f"}
Apr 20 07:50:26.422791 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.422783    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" event={"ID":"872f16e1-a280-4e38-b34a-f24ffef351d3","Type":"ContainerStarted","Data":"21c3b4078612954ec8c502a4809dba4498b1611c8a553e3759a2993fcac7ed46"}
Apr 20 07:50:26.422791 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.422791    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" event={"ID":"872f16e1-a280-4e38-b34a-f24ffef351d3","Type":"ContainerDied","Data":"bd5b3ecae436dc0edd6314550d44c0ac4c2d1993387086e27c6d426cf88ec3e0"}
Apr 20 07:50:26.423038 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.422799    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" event={"ID":"872f16e1-a280-4e38-b34a-f24ffef351d3","Type":"ContainerStarted","Data":"edc5055d058888aa36c26f7f6058f80957aef5d91335eb9a7b2835a45591a455"}
Apr 20 07:50:26.424145 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.424123    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qh227" event={"ID":"a87d5cb5-84a7-4b46-9c01-785c30aedcbf","Type":"ContainerStarted","Data":"d854ef7a98223a395192972549a0b6cc97959d5da52fce0d34db4f45a893f52f"}
Apr 20 07:50:26.445003 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.444952    2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-w9j66" podStartSLOduration=2.055784986 podStartE2EDuration="19.444940799s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:08.495175849 +0000 UTC m=+1.791748814" lastFinishedPulling="2026-04-20 07:50:25.884331665 +0000 UTC m=+19.180904627" observedRunningTime="2026-04-20 07:50:26.43262589 +0000 UTC m=+19.729198885" watchObservedRunningTime="2026-04-20 07:50:26.444940799 +0000 UTC m=+19.741513783"
Apr 20 07:50:26.445406 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.445380    2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-4.ec2.internal" podStartSLOduration=19.445372471 podStartE2EDuration="19.445372471s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:50:26.444812386 +0000 UTC m=+19.741385371" watchObservedRunningTime="2026-04-20 07:50:26.445372471 +0000 UTC m=+19.741945455"
Apr 20 07:50:26.458817 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:26.458781    2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qh227" podStartSLOduration=2.032321605 podStartE2EDuration="19.45877243s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:08.515597714 +0000 UTC m=+1.812170679" lastFinishedPulling="2026-04-20 07:50:25.942048542 +0000 UTC m=+19.238621504" observedRunningTime="2026-04-20 07:50:26.458143619 +0000 UTC m=+19.754716604"
watchObservedRunningTime="2026-04-20 07:50:26.45877243 +0000 UTC m=+19.755345414" Apr 20 07:50:27.427829 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.427796 2569 generic.go:358] "Generic (PLEG): container finished" podID="9c9000e672991f209623b410921cb239" containerID="368d0783e2f67e728ff3fafec9355d4f8411c70bce27362c54ddf104ba9f2520" exitCode=0 Apr 20 07:50:27.428235 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.427889 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" event={"ID":"9c9000e672991f209623b410921cb239","Type":"ContainerDied","Data":"368d0783e2f67e728ff3fafec9355d4f8411c70bce27362c54ddf104ba9f2520"} Apr 20 07:50:27.429155 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.429126 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" event={"ID":"a63eb0b8-9da1-4db3-8e08-97401e379909","Type":"ContainerStarted","Data":"6a7bc144a31ae29f567162f5dc721c31eb5a5945284c91d60df8b7f5fa2af500"} Apr 20 07:50:27.430724 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.430702 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s2r4p" event={"ID":"75f0e0ec-e11d-4c52-b6bd-ec1da8086f15","Type":"ContainerStarted","Data":"5bf4de9109bb1c68f0ec1d39db42c12a14b83f597738c6eee41070e28fe4600d"} Apr 20 07:50:27.432188 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.432159 2569 generic.go:358] "Generic (PLEG): container finished" podID="ea6db407-9937-4b0f-84e4-91f5c10786a5" containerID="f0199ba1602ce9f3ab4ed35c7fa01e529fd153f484dc6acdc44a5c8d7666e7c5" exitCode=0 Apr 20 07:50:27.432293 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.432240 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" 
event={"ID":"ea6db407-9937-4b0f-84e4-91f5c10786a5","Type":"ContainerDied","Data":"f0199ba1602ce9f3ab4ed35c7fa01e529fd153f484dc6acdc44a5c8d7666e7c5"} Apr 20 07:50:27.433577 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.433524 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jd8vj" event={"ID":"8c2b7af6-350a-4223-8075-1a8760e67c96","Type":"ContainerStarted","Data":"0509fc07b12bccb6dd93583807dbd0e565d044d5e92d170a638ec3d261d79bde"} Apr 20 07:50:27.435013 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.434977 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s26s7" event={"ID":"d7076af5-74cf-4fa6-ac90-de8f6cc674e4","Type":"ContainerStarted","Data":"6afd43a19a30a1dea78037677bcfe149caa901229f1f2fe8955c17f489198201"} Apr 20 07:50:27.436280 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.436228 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-khs8v" event={"ID":"f5d50f48-c1cd-490d-8f78-48a66378ab3a","Type":"ContainerStarted","Data":"6012c7646b66cd20264cefa12158a2d158f5188435e34fc55df17207d59c7e16"} Apr 20 07:50:27.453667 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.453622 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-khs8v" podStartSLOduration=3.141274536 podStartE2EDuration="20.45360698s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:08.509953178 +0000 UTC m=+1.806526140" lastFinishedPulling="2026-04-20 07:50:25.822285614 +0000 UTC m=+19.118858584" observedRunningTime="2026-04-20 07:50:27.453351248 +0000 UTC m=+20.749924235" watchObservedRunningTime="2026-04-20 07:50:27.45360698 +0000 UTC m=+20.750179965" Apr 20 07:50:27.465013 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.464971 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s26s7" 
podStartSLOduration=2.9970611959999998 podStartE2EDuration="20.464949209s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:08.530238257 +0000 UTC m=+1.826811220" lastFinishedPulling="2026-04-20 07:50:25.998126271 +0000 UTC m=+19.294699233" observedRunningTime="2026-04-20 07:50:27.464551089 +0000 UTC m=+20.761124072" watchObservedRunningTime="2026-04-20 07:50:27.464949209 +0000 UTC m=+20.761522194" Apr 20 07:50:27.476439 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.476397 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jd8vj" podStartSLOduration=3.221856993 podStartE2EDuration="20.47638693s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:08.567722791 +0000 UTC m=+1.864295756" lastFinishedPulling="2026-04-20 07:50:25.822252718 +0000 UTC m=+19.118825693" observedRunningTime="2026-04-20 07:50:27.475744099 +0000 UTC m=+20.772317083" watchObservedRunningTime="2026-04-20 07:50:27.47638693 +0000 UTC m=+20.772959914" Apr 20 07:50:27.508496 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.507721 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-s2r4p" podStartSLOduration=3.23364634 podStartE2EDuration="20.507704487s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:08.554615142 +0000 UTC m=+1.851188103" lastFinishedPulling="2026-04-20 07:50:25.828673281 +0000 UTC m=+19.125246250" observedRunningTime="2026-04-20 07:50:27.5070362 +0000 UTC m=+20.803609186" watchObservedRunningTime="2026-04-20 07:50:27.507704487 +0000 UTC m=+20.804277472" Apr 20 07:50:27.549721 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:27.549593 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 07:50:28.243322 ip-10-0-138-4 kubenswrapper[2569]: 
I0420 07:50:28.243185 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T07:50:27.54971766Z","UUID":"06b2e275-4224-40e7-bba4-4b5bff1cf59a","Handler":null,"Name":"","Endpoint":""} Apr 20 07:50:28.244904 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:28.244880 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 07:50:28.244904 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:28.244910 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 07:50:28.276798 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:28.276706 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:28.276798 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:28.276718 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:28.276981 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:28.276710 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:28.276981 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:28.276818 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631" Apr 20 07:50:28.276981 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:28.276922 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969" Apr 20 07:50:28.277113 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:28.277015 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd" Apr 20 07:50:28.440530 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:28.440497 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" event={"ID":"a63eb0b8-9da1-4db3-8e08-97401e379909","Type":"ContainerStarted","Data":"cc2a39b008d7baf65f3d17c18690a29f40b86d1426063bb716012748c196bf58"} Apr 20 07:50:28.440898 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:28.440539 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" event={"ID":"a63eb0b8-9da1-4db3-8e08-97401e379909","Type":"ContainerStarted","Data":"ab910a9501b9aaf3f58149dc8c030f9fc0a00de6ffa06bca2954aac2d851f740"} Apr 20 07:50:28.442580 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:28.442199 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" 
event={"ID":"9c9000e672991f209623b410921cb239","Type":"ContainerStarted","Data":"7163a1efac725856649ad3b88656ea1b38057665181e202a240803725d334cc9"} Apr 20 07:50:28.454804 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:28.454753 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wx88" podStartSLOduration=1.731079394 podStartE2EDuration="21.454741008s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:08.562635487 +0000 UTC m=+1.859208449" lastFinishedPulling="2026-04-20 07:50:28.286297095 +0000 UTC m=+21.582870063" observedRunningTime="2026-04-20 07:50:28.454383587 +0000 UTC m=+21.750956584" watchObservedRunningTime="2026-04-20 07:50:28.454741008 +0000 UTC m=+21.751313992" Apr 20 07:50:28.465952 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:28.465913 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-4.ec2.internal" podStartSLOduration=21.465901414 podStartE2EDuration="21.465901414s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:50:28.465841425 +0000 UTC m=+21.762414410" watchObservedRunningTime="2026-04-20 07:50:28.465901414 +0000 UTC m=+21.762474398" Apr 20 07:50:29.384234 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:29.383952 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-s2r4p" Apr 20 07:50:29.384649 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:29.384625 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-s2r4p" Apr 20 07:50:29.447366 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:29.447342 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 07:50:29.447877 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:29.447784 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" event={"ID":"872f16e1-a280-4e38-b34a-f24ffef351d3","Type":"ContainerStarted","Data":"d1be1199329342aaba5a93f36494323ed8af9f41012bba72d4d272135aa93763"} Apr 20 07:50:29.448119 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:29.448096 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-s2r4p" Apr 20 07:50:29.448580 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:29.448561 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-s2r4p" Apr 20 07:50:30.276410 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:30.276380 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:30.276410 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:30.276404 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:30.276652 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:30.276380 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:30.276652 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:30.276496 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631" Apr 20 07:50:30.276652 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:30.276593 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969" Apr 20 07:50:30.276791 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:30.276689 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd" Apr 20 07:50:31.454717 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:31.454514 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 07:50:31.455284 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:31.455133 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" event={"ID":"872f16e1-a280-4e38-b34a-f24ffef351d3","Type":"ContainerStarted","Data":"f8cd4c902e994bd517d0d21abc9bd0aaee3609bf3ba09dfbad5dce7d4f5c3d50"} Apr 20 07:50:31.456283 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:31.455678 2569 scope.go:117] "RemoveContainer" containerID="bd5b3ecae436dc0edd6314550d44c0ac4c2d1993387086e27c6d426cf88ec3e0" Apr 20 07:50:32.276692 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.276664 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:32.276848 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.276666 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:32.276848 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:32.276755 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd" Apr 20 07:50:32.276848 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:32.276831 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631" Apr 20 07:50:32.276848 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.276666 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:32.276972 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:32.276911 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969" Apr 20 07:50:32.458303 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.458269 2569 generic.go:358] "Generic (PLEG): container finished" podID="ea6db407-9937-4b0f-84e4-91f5c10786a5" containerID="54e497533d1ca401b9515b3a93db6362f7de3bfe9eed16acb0b3ac6b00f56959" exitCode=0 Apr 20 07:50:32.458747 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.458358 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" event={"ID":"ea6db407-9937-4b0f-84e4-91f5c10786a5","Type":"ContainerDied","Data":"54e497533d1ca401b9515b3a93db6362f7de3bfe9eed16acb0b3ac6b00f56959"} Apr 20 07:50:32.461675 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.461659 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 07:50:32.462002 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.461983 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" event={"ID":"872f16e1-a280-4e38-b34a-f24ffef351d3","Type":"ContainerStarted","Data":"0433cabe256e1ee82924819ce500f76ed722444a4d07435f768014b0c2229528"} Apr 20 07:50:32.462334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.462311 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:32.462440 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.462343 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:32.462440 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.462358 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:32.476500 ip-10-0-138-4 kubenswrapper[2569]: I0420 
07:50:32.476481 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:32.476602 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.476563 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:50:32.499826 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.499791 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" podStartSLOduration=8.193672111 podStartE2EDuration="25.499780751s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:08.57735007 +0000 UTC m=+1.873923037" lastFinishedPulling="2026-04-20 07:50:25.883458708 +0000 UTC m=+19.180031677" observedRunningTime="2026-04-20 07:50:32.499396659 +0000 UTC m=+25.795969655" watchObservedRunningTime="2026-04-20 07:50:32.499780751 +0000 UTC m=+25.796353735" Apr 20 07:50:32.676750 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:32.676671 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:32.676884 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:32.676811 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:32.676884 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:32.676871 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret podName:d06eb14c-741b-46bf-aada-fd390434ddfd nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:48.676856012 +0000 UTC m=+41.973428974 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret") pod "global-pull-secret-syncer-r4n6p" (UID: "d06eb14c-741b-46bf-aada-fd390434ddfd") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:33.465907 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:33.465687 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" event={"ID":"ea6db407-9937-4b0f-84e4-91f5c10786a5","Type":"ContainerStarted","Data":"74a5257039c6eb460568961d2cbe16533005f4a1cf6ae06b399a167bacbdf20e"} Apr 20 07:50:33.544672 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:33.544638 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r4n6p"] Apr 20 07:50:33.544789 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:33.544778 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:33.544913 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:33.544888 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd" Apr 20 07:50:33.547333 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:33.547303 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m5qfv"] Apr 20 07:50:33.547451 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:33.547424 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:33.547608 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:33.547561 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969" Apr 20 07:50:33.548054 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:33.548034 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vmv56"] Apr 20 07:50:33.548147 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:33.548133 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:33.548259 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:33.548239 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631" Apr 20 07:50:34.469027 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:34.468999 2569 generic.go:358] "Generic (PLEG): container finished" podID="ea6db407-9937-4b0f-84e4-91f5c10786a5" containerID="74a5257039c6eb460568961d2cbe16533005f4a1cf6ae06b399a167bacbdf20e" exitCode=0 Apr 20 07:50:34.469395 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:34.469089 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" event={"ID":"ea6db407-9937-4b0f-84e4-91f5c10786a5","Type":"ContainerDied","Data":"74a5257039c6eb460568961d2cbe16533005f4a1cf6ae06b399a167bacbdf20e"} Apr 20 07:50:35.276403 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:35.276375 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:35.276580 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:35.276378 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:35.276580 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:35.276474 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd" Apr 20 07:50:35.276703 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:35.276576 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969" Apr 20 07:50:35.276703 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:35.276384 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:35.276703 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:35.276674 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631" Apr 20 07:50:35.473064 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:35.473035 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" event={"ID":"ea6db407-9937-4b0f-84e4-91f5c10786a5","Type":"ContainerStarted","Data":"e6e4bcaef57e92bffe359a692649d503185ecf181954ab1076ef0c626dc7c943"} Apr 20 07:50:36.477474 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:36.477442 2569 generic.go:358] "Generic (PLEG): container finished" podID="ea6db407-9937-4b0f-84e4-91f5c10786a5" containerID="e6e4bcaef57e92bffe359a692649d503185ecf181954ab1076ef0c626dc7c943" exitCode=0 Apr 20 07:50:36.477855 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:36.477511 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" event={"ID":"ea6db407-9937-4b0f-84e4-91f5c10786a5","Type":"ContainerDied","Data":"e6e4bcaef57e92bffe359a692649d503185ecf181954ab1076ef0c626dc7c943"} Apr 20 07:50:37.276916 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:37.276889 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:37.277191 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:37.276890 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:37.277316 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:37.276898 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:37.277468 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:37.277436 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969" Apr 20 07:50:37.277608 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:37.277582 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r4n6p" podUID="d06eb14c-741b-46bf-aada-fd390434ddfd" Apr 20 07:50:37.277697 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:37.277676 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vmv56" podUID="0a4dfd92-0a59-4e3f-bb86-c3a74ffec631" Apr 20 07:50:38.556584 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.556371 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-4.ec2.internal" event="NodeReady" Apr 20 07:50:38.556998 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.556686 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 07:50:38.597646 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.597619 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6pmzx"] Apr 20 07:50:38.636007 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.635978 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gg5hx"] Apr 20 07:50:38.636233 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.636153 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.638063 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.638039 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 07:50:38.638190 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.638151 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dvf6t\"" Apr 20 07:50:38.638264 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.638050 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 07:50:38.651470 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.651421 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6pmzx"] Apr 20 07:50:38.651470 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.651443 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gg5hx"] Apr 20 
07:50:38.651599 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.651564 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:38.653372 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.653346 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 07:50:38.653694 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.653680 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mxq77\"" Apr 20 07:50:38.653875 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.653862 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 07:50:38.654127 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.654069 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 07:50:38.722820 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.722783 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dccw\" (UniqueName: \"kubernetes.io/projected/19f51839-0090-41a8-b3ef-00a1ee0ca874-kube-api-access-2dccw\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.722987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.722923 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19f51839-0090-41a8-b3ef-00a1ee0ca874-config-volume\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.723051 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.722993 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.723051 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.723011 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19f51839-0090-41a8-b3ef-00a1ee0ca874-tmp-dir\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.823646 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.823611 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dccw\" (UniqueName: \"kubernetes.io/projected/19f51839-0090-41a8-b3ef-00a1ee0ca874-kube-api-access-2dccw\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.823813 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.823664 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:38.823813 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.823772 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqmtg\" (UniqueName: \"kubernetes.io/projected/5ba51e49-9c17-47c6-813d-05581eece4d6-kube-api-access-mqmtg\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:38.823924 ip-10-0-138-4 kubenswrapper[2569]: I0420 
07:50:38.823834 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19f51839-0090-41a8-b3ef-00a1ee0ca874-config-volume\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.823924 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.823905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.824031 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.823928 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19f51839-0090-41a8-b3ef-00a1ee0ca874-tmp-dir\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.824085 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:38.824030 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:38.824135 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:38.824094 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls podName:19f51839-0090-41a8-b3ef-00a1ee0ca874 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:39.324074003 +0000 UTC m=+32.620646973 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls") pod "dns-default-6pmzx" (UID: "19f51839-0090-41a8-b3ef-00a1ee0ca874") : secret "dns-default-metrics-tls" not found Apr 20 07:50:38.824314 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.824296 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19f51839-0090-41a8-b3ef-00a1ee0ca874-tmp-dir\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.824556 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.824534 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19f51839-0090-41a8-b3ef-00a1ee0ca874-config-volume\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.833926 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.833902 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dccw\" (UniqueName: \"kubernetes.io/projected/19f51839-0090-41a8-b3ef-00a1ee0ca874-kube-api-access-2dccw\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:38.925143 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.925028 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:38.925143 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.925103 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqmtg\" (UniqueName: 
\"kubernetes.io/projected/5ba51e49-9c17-47c6-813d-05581eece4d6-kube-api-access-mqmtg\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:38.925383 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:38.925200 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:38.925383 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:38.925295 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert podName:5ba51e49-9c17-47c6-813d-05581eece4d6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:39.42527578 +0000 UTC m=+32.721848741 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert") pod "ingress-canary-gg5hx" (UID: "5ba51e49-9c17-47c6-813d-05581eece4d6") : secret "canary-serving-cert" not found Apr 20 07:50:38.935404 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:38.935379 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqmtg\" (UniqueName: \"kubernetes.io/projected/5ba51e49-9c17-47c6-813d-05581eece4d6-kube-api-access-mqmtg\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:39.276489 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.276455 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:39.276647 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.276455 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:39.276704 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.276455 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:39.278741 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.278720 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 07:50:39.278881 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.278819 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2g69h\"" Apr 20 07:50:39.278881 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.278828 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 07:50:39.279094 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.279069 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x6tc5\"" Apr 20 07:50:39.279094 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.279077 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 07:50:39.279094 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.279090 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 07:50:39.327605 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.327581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:39.327745 ip-10-0-138-4 
kubenswrapper[2569]: E0420 07:50:39.327729 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:39.327833 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:39.327791 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls podName:19f51839-0090-41a8-b3ef-00a1ee0ca874 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:40.327772916 +0000 UTC m=+33.624345878 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls") pod "dns-default-6pmzx" (UID: "19f51839-0090-41a8-b3ef-00a1ee0ca874") : secret "dns-default-metrics-tls" not found Apr 20 07:50:39.427937 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.427898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:39.428103 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:39.428016 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:39.428103 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:39.428077 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert podName:5ba51e49-9c17-47c6-813d-05581eece4d6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:40.428060297 +0000 UTC m=+33.724633260 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert") pod "ingress-canary-gg5hx" (UID: "5ba51e49-9c17-47c6-813d-05581eece4d6") : secret "canary-serving-cert" not found Apr 20 07:50:39.933348 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:39.933309 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:50:39.933806 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:39.933500 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 07:50:39.933806 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:39.933582 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs podName:0e96090b-285a-4c1b-98c7-6793626b3969 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:11.933560933 +0000 UTC m=+65.230133907 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs") pod "network-metrics-daemon-m5qfv" (UID: "0e96090b-285a-4c1b-98c7-6793626b3969") : secret "metrics-daemon-secret" not found Apr 20 07:50:40.034334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:40.034290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tkc\" (UniqueName: \"kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc\") pod \"network-check-target-vmv56\" (UID: \"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631\") " pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:40.037429 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:40.037403 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2tkc\" (UniqueName: \"kubernetes.io/projected/0a4dfd92-0a59-4e3f-bb86-c3a74ffec631-kube-api-access-c2tkc\") pod \"network-check-target-vmv56\" (UID: \"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631\") " pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:40.195835 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:40.195755 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:40.336782 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:40.336743 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:40.336956 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:40.336920 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:40.337004 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:40.336988 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls podName:19f51839-0090-41a8-b3ef-00a1ee0ca874 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:42.336972049 +0000 UTC m=+35.633545010 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls") pod "dns-default-6pmzx" (UID: "19f51839-0090-41a8-b3ef-00a1ee0ca874") : secret "dns-default-metrics-tls" not found Apr 20 07:50:40.438099 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:40.438050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:40.438296 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:40.438196 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:40.438296 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:40.438275 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert podName:5ba51e49-9c17-47c6-813d-05581eece4d6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:42.438256779 +0000 UTC m=+35.734829759 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert") pod "ingress-canary-gg5hx" (UID: "5ba51e49-9c17-47c6-813d-05581eece4d6") : secret "canary-serving-cert" not found Apr 20 07:50:42.068276 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:42.068242 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vmv56"] Apr 20 07:50:42.110839 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:42.110801 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a4dfd92_0a59_4e3f_bb86_c3a74ffec631.slice/crio-ed9df28118d7567544d1aa7278a3d198c03019aee27c8237c7bafff2ad677d80 WatchSource:0}: Error finding container ed9df28118d7567544d1aa7278a3d198c03019aee27c8237c7bafff2ad677d80: Status 404 returned error can't find the container with id ed9df28118d7567544d1aa7278a3d198c03019aee27c8237c7bafff2ad677d80 Apr 20 07:50:42.354744 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:42.354508 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:42.354913 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:42.354656 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:42.354913 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:42.354888 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls podName:19f51839-0090-41a8-b3ef-00a1ee0ca874 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:46.354867515 +0000 UTC m=+39.651440480 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls") pod "dns-default-6pmzx" (UID: "19f51839-0090-41a8-b3ef-00a1ee0ca874") : secret "dns-default-metrics-tls" not found Apr 20 07:50:42.455624 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:42.455555 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:42.455732 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:42.455672 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:42.455732 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:42.455723 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert podName:5ba51e49-9c17-47c6-813d-05581eece4d6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:46.455707207 +0000 UTC m=+39.752280168 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert") pod "ingress-canary-gg5hx" (UID: "5ba51e49-9c17-47c6-813d-05581eece4d6") : secret "canary-serving-cert" not found Apr 20 07:50:42.491252 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:42.491223 2569 generic.go:358] "Generic (PLEG): container finished" podID="ea6db407-9937-4b0f-84e4-91f5c10786a5" containerID="d4e53611cbd94d16b52703a77c4d563ac33327283ae7be4cab41ba670ce742f2" exitCode=0 Apr 20 07:50:42.491403 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:42.491296 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" event={"ID":"ea6db407-9937-4b0f-84e4-91f5c10786a5","Type":"ContainerDied","Data":"d4e53611cbd94d16b52703a77c4d563ac33327283ae7be4cab41ba670ce742f2"} Apr 20 07:50:42.492278 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:42.492260 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vmv56" event={"ID":"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631","Type":"ContainerStarted","Data":"ed9df28118d7567544d1aa7278a3d198c03019aee27c8237c7bafff2ad677d80"} Apr 20 07:50:43.500045 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:43.500013 2569 generic.go:358] "Generic (PLEG): container finished" podID="ea6db407-9937-4b0f-84e4-91f5c10786a5" containerID="0e7a5cdef280bab47d247e4459528abdfc4f076bb405d5fe7757fb87618defdf" exitCode=0 Apr 20 07:50:43.500499 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:43.500055 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" event={"ID":"ea6db407-9937-4b0f-84e4-91f5c10786a5","Type":"ContainerDied","Data":"0e7a5cdef280bab47d247e4459528abdfc4f076bb405d5fe7757fb87618defdf"} Apr 20 07:50:44.506594 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:44.506558 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-gs6zc" event={"ID":"ea6db407-9937-4b0f-84e4-91f5c10786a5","Type":"ContainerStarted","Data":"7362a3620715a09f2e2de8f6637117d2efc02e41b00cbba1a7d75748daefb894"} Apr 20 07:50:44.527948 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:44.527803 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gs6zc" podStartSLOduration=3.957750984 podStartE2EDuration="37.527786805s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:08.573159269 +0000 UTC m=+1.869732231" lastFinishedPulling="2026-04-20 07:50:42.143195076 +0000 UTC m=+35.439768052" observedRunningTime="2026-04-20 07:50:44.526732582 +0000 UTC m=+37.823305566" watchObservedRunningTime="2026-04-20 07:50:44.527786805 +0000 UTC m=+37.824359790" Apr 20 07:50:45.509269 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:45.509240 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vmv56" event={"ID":"0a4dfd92-0a59-4e3f-bb86-c3a74ffec631","Type":"ContainerStarted","Data":"02cae0ecd92c13d5079c1ed4876fa8fa8b3bef0527409bf8e964f591c60bde41"} Apr 20 07:50:45.509597 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:45.509363 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:50:45.522771 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:45.522731 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vmv56" podStartSLOduration=35.256756263 podStartE2EDuration="38.522719411s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:42.119331427 +0000 UTC m=+35.415904396" lastFinishedPulling="2026-04-20 07:50:45.38529456 +0000 UTC m=+38.681867544" observedRunningTime="2026-04-20 07:50:45.521584422 +0000 UTC 
m=+38.818157406" watchObservedRunningTime="2026-04-20 07:50:45.522719411 +0000 UTC m=+38.819292395" Apr 20 07:50:46.384893 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:46.384851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:46.385072 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:46.384986 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:46.385072 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:46.385055 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls podName:19f51839-0090-41a8-b3ef-00a1ee0ca874 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:54.385038588 +0000 UTC m=+47.681611551 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls") pod "dns-default-6pmzx" (UID: "19f51839-0090-41a8-b3ef-00a1ee0ca874") : secret "dns-default-metrics-tls" not found Apr 20 07:50:46.485621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:46.485587 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:46.485774 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:46.485727 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:46.485821 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:46.485783 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert podName:5ba51e49-9c17-47c6-813d-05581eece4d6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:54.485767575 +0000 UTC m=+47.782340539 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert") pod "ingress-canary-gg5hx" (UID: "5ba51e49-9c17-47c6-813d-05581eece4d6") : secret "canary-serving-cert" not found Apr 20 07:50:48.699932 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:48.699886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:48.703294 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:48.703275 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d06eb14c-741b-46bf-aada-fd390434ddfd-original-pull-secret\") pod \"global-pull-secret-syncer-r4n6p\" (UID: \"d06eb14c-741b-46bf-aada-fd390434ddfd\") " pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:48.888936 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:48.888898 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-r4n6p" Apr 20 07:50:48.997228 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:48.997184 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r4n6p"] Apr 20 07:50:49.000626 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:49.000601 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd06eb14c_741b_46bf_aada_fd390434ddfd.slice/crio-8b4e5f1306063d72c1322c6545c8b29579bc64610e50cc6222781816fc39b91f WatchSource:0}: Error finding container 8b4e5f1306063d72c1322c6545c8b29579bc64610e50cc6222781816fc39b91f: Status 404 returned error can't find the container with id 8b4e5f1306063d72c1322c6545c8b29579bc64610e50cc6222781816fc39b91f Apr 20 07:50:49.342693 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.342664 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp"] Apr 20 07:50:49.382678 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.382651 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8"] Apr 20 07:50:49.382823 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.382786 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" Apr 20 07:50:49.384857 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.384834 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 07:50:49.385145 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.385126 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 07:50:49.385269 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.385127 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 07:50:49.385269 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.385159 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 07:50:49.385352 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.385277 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-4sp2w\"" Apr 20 07:50:49.400312 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.400280 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp"] Apr 20 07:50:49.400312 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.400299 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.400469 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.400328 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8"] Apr 20 07:50:49.402374 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.402353 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 07:50:49.402480 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.402354 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 07:50:49.402480 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.402402 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 07:50:49.402480 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.402402 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 07:50:49.504916 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.504886 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-hub\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.504916 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.504917 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9xq7\" (UniqueName: 
\"kubernetes.io/projected/3bf59936-0a81-4ee4-9a0b-20c6903ce458-kube-api-access-d9xq7\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.505072 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.504936 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w227j\" (UniqueName: \"kubernetes.io/projected/67e7a84b-83ee-4bca-8972-e2dd70f4dddc-kube-api-access-w227j\") pod \"managed-serviceaccount-addon-agent-548cc44b99-q6bnp\" (UID: \"67e7a84b-83ee-4bca-8972-e2dd70f4dddc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" Apr 20 07:50:49.505072 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.504991 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.505072 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.505043 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-ca\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.505072 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.505068 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/67e7a84b-83ee-4bca-8972-e2dd70f4dddc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-548cc44b99-q6bnp\" (UID: \"67e7a84b-83ee-4bca-8972-e2dd70f4dddc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" Apr 20 07:50:49.505189 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.505097 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.505189 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.505113 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3bf59936-0a81-4ee4-9a0b-20c6903ce458-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.517258 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.517232 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r4n6p" event={"ID":"d06eb14c-741b-46bf-aada-fd390434ddfd","Type":"ContainerStarted","Data":"8b4e5f1306063d72c1322c6545c8b29579bc64610e50cc6222781816fc39b91f"} Apr 20 07:50:49.605835 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.605756 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-hub\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.605835 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.605796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9xq7\" (UniqueName: \"kubernetes.io/projected/3bf59936-0a81-4ee4-9a0b-20c6903ce458-kube-api-access-d9xq7\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.605835 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.605814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w227j\" (UniqueName: \"kubernetes.io/projected/67e7a84b-83ee-4bca-8972-e2dd70f4dddc-kube-api-access-w227j\") pod \"managed-serviceaccount-addon-agent-548cc44b99-q6bnp\" (UID: \"67e7a84b-83ee-4bca-8972-e2dd70f4dddc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" Apr 20 07:50:49.606078 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.605855 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.606078 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.605885 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-ca\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.606078 ip-10-0-138-4 
kubenswrapper[2569]: I0420 07:50:49.605910 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67e7a84b-83ee-4bca-8972-e2dd70f4dddc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-548cc44b99-q6bnp\" (UID: \"67e7a84b-83ee-4bca-8972-e2dd70f4dddc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" Apr 20 07:50:49.606078 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.605939 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.606078 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.605965 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3bf59936-0a81-4ee4-9a0b-20c6903ce458-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.608387 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.608357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67e7a84b-83ee-4bca-8972-e2dd70f4dddc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-548cc44b99-q6bnp\" (UID: \"67e7a84b-83ee-4bca-8972-e2dd70f4dddc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" Apr 20 07:50:49.613362 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.613338 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" 
(UniqueName: \"kubernetes.io/configmap/3bf59936-0a81-4ee4-9a0b-20c6903ce458-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.613912 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.613883 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w227j\" (UniqueName: \"kubernetes.io/projected/67e7a84b-83ee-4bca-8972-e2dd70f4dddc-kube-api-access-w227j\") pod \"managed-serviceaccount-addon-agent-548cc44b99-q6bnp\" (UID: \"67e7a84b-83ee-4bca-8972-e2dd70f4dddc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" Apr 20 07:50:49.615065 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.615035 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.615194 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.615177 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.615312 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.615286 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9xq7\" (UniqueName: \"kubernetes.io/projected/3bf59936-0a81-4ee4-9a0b-20c6903ce458-kube-api-access-d9xq7\") pod 
\"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.615453 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.615436 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-ca\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.615505 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.615452 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3bf59936-0a81-4ee4-9a0b-20c6903ce458-hub\") pod \"cluster-proxy-proxy-agent-585fc5c8cd-fsdj8\" (UID: \"3bf59936-0a81-4ee4-9a0b-20c6903ce458\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.702053 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.702022 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" Apr 20 07:50:49.708089 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.707800 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:50:49.842257 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.842181 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp"] Apr 20 07:50:49.847880 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:49.847848 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67e7a84b_83ee_4bca_8972_e2dd70f4dddc.slice/crio-cc593a0b24d8db6dd49a5e2e2972e3d8e6473d0e313ad78e2e609498f79472f2 WatchSource:0}: Error finding container cc593a0b24d8db6dd49a5e2e2972e3d8e6473d0e313ad78e2e609498f79472f2: Status 404 returned error can't find the container with id cc593a0b24d8db6dd49a5e2e2972e3d8e6473d0e313ad78e2e609498f79472f2 Apr 20 07:50:49.874002 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:49.873935 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8"] Apr 20 07:50:49.877661 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:50:49.877632 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bf59936_0a81_4ee4_9a0b_20c6903ce458.slice/crio-daad99410b5efff0cf4faf79b911539da645f64186a5b4c05d668ac97828bbf9 WatchSource:0}: Error finding container daad99410b5efff0cf4faf79b911539da645f64186a5b4c05d668ac97828bbf9: Status 404 returned error can't find the container with id daad99410b5efff0cf4faf79b911539da645f64186a5b4c05d668ac97828bbf9 Apr 20 07:50:50.520787 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:50.520743 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" 
event={"ID":"67e7a84b-83ee-4bca-8972-e2dd70f4dddc","Type":"ContainerStarted","Data":"cc593a0b24d8db6dd49a5e2e2972e3d8e6473d0e313ad78e2e609498f79472f2"} Apr 20 07:50:50.522039 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:50.521997 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" event={"ID":"3bf59936-0a81-4ee4-9a0b-20c6903ce458","Type":"ContainerStarted","Data":"daad99410b5efff0cf4faf79b911539da645f64186a5b4c05d668ac97828bbf9"} Apr 20 07:50:54.446193 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:54.446146 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:50:54.446660 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:54.446308 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:54.446660 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:54.446390 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls podName:19f51839-0090-41a8-b3ef-00a1ee0ca874 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:10.44636831 +0000 UTC m=+63.742941285 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls") pod "dns-default-6pmzx" (UID: "19f51839-0090-41a8-b3ef-00a1ee0ca874") : secret "dns-default-metrics-tls" not found Apr 20 07:50:54.547199 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:54.547154 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:50:54.547389 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:54.547340 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:54.547460 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:50:54.547429 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert podName:5ba51e49-9c17-47c6-813d-05581eece4d6 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:10.547410169 +0000 UTC m=+63.843983142 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert") pod "ingress-canary-gg5hx" (UID: "5ba51e49-9c17-47c6-813d-05581eece4d6") : secret "canary-serving-cert" not found Apr 20 07:50:55.533861 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:55.533824 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" event={"ID":"3bf59936-0a81-4ee4-9a0b-20c6903ce458","Type":"ContainerStarted","Data":"9acec87ea2b1d11ca628a0059182c4a0b8dbeac97d7d4d31b683e88f651d7976"} Apr 20 07:50:55.534964 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:55.534943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r4n6p" event={"ID":"d06eb14c-741b-46bf-aada-fd390434ddfd","Type":"ContainerStarted","Data":"dd178d9c7384341b90da8d1705f61ece0591322503cea4a33b4616b407680493"} Apr 20 07:50:55.536267 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:55.536243 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" event={"ID":"67e7a84b-83ee-4bca-8972-e2dd70f4dddc","Type":"ContainerStarted","Data":"f6a84d087a396961bcaf60f4e567952bdf561f8de923b203cdee4d9b0eaf0975"} Apr 20 07:50:55.548618 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:55.548575 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-r4n6p" podStartSLOduration=33.61368778 podStartE2EDuration="39.548563174s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:50:49.002325676 +0000 UTC m=+42.298898638" lastFinishedPulling="2026-04-20 07:50:54.937201058 +0000 UTC m=+48.233774032" observedRunningTime="2026-04-20 07:50:55.547897041 +0000 UTC m=+48.844470024" watchObservedRunningTime="2026-04-20 07:50:55.548563174 +0000 UTC m=+48.845136209" Apr 20 07:50:55.560712 
ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:55.560674 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" podStartSLOduration=1.468018487 podStartE2EDuration="6.560663169s" podCreationTimestamp="2026-04-20 07:50:49 +0000 UTC" firstStartedPulling="2026-04-20 07:50:49.850751172 +0000 UTC m=+43.147324134" lastFinishedPulling="2026-04-20 07:50:54.943395855 +0000 UTC m=+48.239968816" observedRunningTime="2026-04-20 07:50:55.559854853 +0000 UTC m=+48.856427837" watchObservedRunningTime="2026-04-20 07:50:55.560663169 +0000 UTC m=+48.857236152" Apr 20 07:50:57.542883 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:57.542848 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" event={"ID":"3bf59936-0a81-4ee4-9a0b-20c6903ce458","Type":"ContainerStarted","Data":"38f41ef2af7e226c7d38aeee69e2e883845960ef2e9581f05646db43b7931667"} Apr 20 07:50:57.542883 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:57.542887 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" event={"ID":"3bf59936-0a81-4ee4-9a0b-20c6903ce458","Type":"ContainerStarted","Data":"0c31d44361a79fb6874e3729ec8f2fe4160f7b84b3fbd736a2c289a03daeb1ba"} Apr 20 07:50:57.558658 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:50:57.558614 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" podStartSLOduration=1.563713396 podStartE2EDuration="8.558600546s" podCreationTimestamp="2026-04-20 07:50:49 +0000 UTC" firstStartedPulling="2026-04-20 07:50:49.879683411 +0000 UTC m=+43.176256378" lastFinishedPulling="2026-04-20 07:50:56.874570563 +0000 UTC m=+50.171143528" observedRunningTime="2026-04-20 07:50:57.55744048 +0000 UTC m=+50.854013464" 
watchObservedRunningTime="2026-04-20 07:50:57.558600546 +0000 UTC m=+50.855173530" Apr 20 07:51:04.484245 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:51:04.484203 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqpbd" Apr 20 07:51:10.461141 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:51:10.461097 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:51:10.461639 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:51:10.461255 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:51:10.461639 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:51:10.461326 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls podName:19f51839-0090-41a8-b3ef-00a1ee0ca874 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:42.461309305 +0000 UTC m=+95.757882266 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls") pod "dns-default-6pmzx" (UID: "19f51839-0090-41a8-b3ef-00a1ee0ca874") : secret "dns-default-metrics-tls" not found Apr 20 07:51:10.562223 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:51:10.562183 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx" Apr 20 07:51:10.562372 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:51:10.562322 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:51:10.562410 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:51:10.562394 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert podName:5ba51e49-9c17-47c6-813d-05581eece4d6 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:42.562378707 +0000 UTC m=+95.858951668 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert") pod "ingress-canary-gg5hx" (UID: "5ba51e49-9c17-47c6-813d-05581eece4d6") : secret "canary-serving-cert" not found Apr 20 07:51:11.973251 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:51:11.973194 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:51:11.973635 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:51:11.973341 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 07:51:11.973635 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:51:11.973404 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs podName:0e96090b-285a-4c1b-98c7-6793626b3969 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:15.973387694 +0000 UTC m=+129.269960656 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs") pod "network-metrics-daemon-m5qfv" (UID: "0e96090b-285a-4c1b-98c7-6793626b3969") : secret "metrics-daemon-secret" not found Apr 20 07:51:16.513234 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:51:16.513182 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vmv56" Apr 20 07:51:42.483584 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:51:42.483480 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx" Apr 20 07:51:42.483996 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:51:42.483634 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:51:42.483996 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:51:42.483709 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls podName:19f51839-0090-41a8-b3ef-00a1ee0ca874 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:46.483691877 +0000 UTC m=+159.780264838 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls") pod "dns-default-6pmzx" (UID: "19f51839-0090-41a8-b3ef-00a1ee0ca874") : secret "dns-default-metrics-tls" not found
Apr 20 07:51:42.583971 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:51:42.583939 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx"
Apr 20 07:51:42.584105 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:51:42.584075 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 07:51:42.584168 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:51:42.584158 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert podName:5ba51e49-9c17-47c6-813d-05581eece4d6 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:46.584143162 +0000 UTC m=+159.880716124 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert") pod "ingress-canary-gg5hx" (UID: "5ba51e49-9c17-47c6-813d-05581eece4d6") : secret "canary-serving-cert" not found
Apr 20 07:52:16.013258 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:16.013219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:52:16.013742 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:52:16.013357 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 07:52:16.013742 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:52:16.013436 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs podName:0e96090b-285a-4c1b-98c7-6793626b3969 nodeName:}" failed. No retries permitted until 2026-04-20 07:54:18.013421095 +0000 UTC m=+251.309994057 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs") pod "network-metrics-daemon-m5qfv" (UID: "0e96090b-285a-4c1b-98c7-6793626b3969") : secret "metrics-daemon-secret" not found
Apr 20 07:52:16.118980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:16.118951 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jd8vj_8c2b7af6-350a-4223-8075-1a8760e67c96/dns-node-resolver/0.log"
Apr 20 07:52:16.918827 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:16.918803 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-khs8v_f5d50f48-c1cd-490d-8f78-48a66378ab3a/node-ca/0.log"
Apr 20 07:52:41.646397 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:52:41.646352 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6pmzx" podUID="19f51839-0090-41a8-b3ef-00a1ee0ca874"
Apr 20 07:52:41.661646 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:52:41.661603 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gg5hx" podUID="5ba51e49-9c17-47c6-813d-05581eece4d6"
Apr 20 07:52:41.786952 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:41.786924 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6pmzx"
Apr 20 07:52:42.301369 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:52:42.301332 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-m5qfv" podUID="0e96090b-285a-4c1b-98c7-6793626b3969"
Apr 20 07:52:46.525690 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:46.525651 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx"
Apr 20 07:52:46.527990 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:46.527969 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/19f51839-0090-41a8-b3ef-00a1ee0ca874-metrics-tls\") pod \"dns-default-6pmzx\" (UID: \"19f51839-0090-41a8-b3ef-00a1ee0ca874\") " pod="openshift-dns/dns-default-6pmzx"
Apr 20 07:52:46.589684 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:46.589657 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dvf6t\""
Apr 20 07:52:46.598083 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:46.598063 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6pmzx"
Apr 20 07:52:46.626333 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:46.626300 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx"
Apr 20 07:52:46.628567 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:46.628542 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba51e49-9c17-47c6-813d-05581eece4d6-cert\") pod \"ingress-canary-gg5hx\" (UID: \"5ba51e49-9c17-47c6-813d-05581eece4d6\") " pod="openshift-ingress-canary/ingress-canary-gg5hx"
Apr 20 07:52:46.709095 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:46.709064 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6pmzx"]
Apr 20 07:52:46.712720 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:52:46.712690 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19f51839_0090_41a8_b3ef_00a1ee0ca874.slice/crio-b15c6fae1cccefa5244c341416ba4563f0b6c43dfc3c40c21888a13594a117a6 WatchSource:0}: Error finding container b15c6fae1cccefa5244c341416ba4563f0b6c43dfc3c40c21888a13594a117a6: Status 404 returned error can't find the container with id b15c6fae1cccefa5244c341416ba4563f0b6c43dfc3c40c21888a13594a117a6
Apr 20 07:52:46.798421 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:46.798349 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6pmzx" event={"ID":"19f51839-0090-41a8-b3ef-00a1ee0ca874","Type":"ContainerStarted","Data":"b15c6fae1cccefa5244c341416ba4563f0b6c43dfc3c40c21888a13594a117a6"}
Apr 20 07:52:47.138173 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.138080 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-znzxf"]
Apr 20 07:52:47.141955 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.141932 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.143926 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.143903 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 07:52:47.144179 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.144161 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 07:52:47.144287 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.144248 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 07:52:47.144579 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.144549 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lvs8m\""
Apr 20 07:52:47.144692 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.144584 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 07:52:47.153913 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.153291 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-znzxf"]
Apr 20 07:52:47.232201 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.232164 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.232201 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.232203 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl9sk\" (UniqueName: \"kubernetes.io/projected/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-kube-api-access-nl9sk\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.232464 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.232243 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-crio-socket\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.232464 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.232296 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-data-volume\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.232464 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.232368 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.333321 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.333278 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-data-volume\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.333321 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.333320 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.333644 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.333382 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.333644 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.333405 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl9sk\" (UniqueName: \"kubernetes.io/projected/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-kube-api-access-nl9sk\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.333644 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.333443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-crio-socket\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.333644 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.333520 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-crio-socket\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.333864 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.333675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-data-volume\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.334027 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.334007 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.335924 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.335902 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.342511 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.342483 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl9sk\" (UniqueName: \"kubernetes.io/projected/deea7fc6-8ab1-4fa5-bc3e-7464e89e4318-kube-api-access-nl9sk\") pod \"insights-runtime-extractor-znzxf\" (UID: \"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318\") " pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.458020 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.457939 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-znzxf"
Apr 20 07:52:47.596479 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:47.596450 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-znzxf"]
Apr 20 07:52:47.950324 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:52:47.950291 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeea7fc6_8ab1_4fa5_bc3e_7464e89e4318.slice/crio-5fa9dcd13b46630d27d340511996468d7823ba2a9aa614182c4e7e56d6e4334a WatchSource:0}: Error finding container 5fa9dcd13b46630d27d340511996468d7823ba2a9aa614182c4e7e56d6e4334a: Status 404 returned error can't find the container with id 5fa9dcd13b46630d27d340511996468d7823ba2a9aa614182c4e7e56d6e4334a
Apr 20 07:52:48.804712 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:48.804625 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-znzxf" event={"ID":"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318","Type":"ContainerStarted","Data":"7dbd6d7c81e36402c6bd89c06411611451caf3e28b94ff3805b34c15a8a3c4d4"}
Apr 20 07:52:48.804712 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:48.804666 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-znzxf" event={"ID":"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318","Type":"ContainerStarted","Data":"f5e7decb9d6af58361d5aa6e2f1112e19dbffbcfdfd112962a95d9c527d1cf96"}
Apr 20 07:52:48.804712 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:48.804679 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-znzxf" event={"ID":"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318","Type":"ContainerStarted","Data":"5fa9dcd13b46630d27d340511996468d7823ba2a9aa614182c4e7e56d6e4334a"}
Apr 20 07:52:48.806133 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:48.806111 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6pmzx" event={"ID":"19f51839-0090-41a8-b3ef-00a1ee0ca874","Type":"ContainerStarted","Data":"1ff0122da359b847b4802b7bb0aaa12d4777b2978a03f83bd699e1241629421a"}
Apr 20 07:52:48.806235 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:48.806138 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6pmzx" event={"ID":"19f51839-0090-41a8-b3ef-00a1ee0ca874","Type":"ContainerStarted","Data":"972585cfd961cb77761661d8e163ab5f57a78c5342ee7d25d8ba4be2e3a4663d"}
Apr 20 07:52:48.806293 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:48.806262 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6pmzx"
Apr 20 07:52:48.820745 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:48.820706 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6pmzx" podStartSLOduration=129.538506689 podStartE2EDuration="2m10.820693893s" podCreationTimestamp="2026-04-20 07:50:38 +0000 UTC" firstStartedPulling="2026-04-20 07:52:46.714418432 +0000 UTC m=+160.010991393" lastFinishedPulling="2026-04-20 07:52:47.99660562 +0000 UTC m=+161.293178597" observedRunningTime="2026-04-20 07:52:48.819802361 +0000 UTC m=+162.116375346" watchObservedRunningTime="2026-04-20 07:52:48.820693893 +0000 UTC m=+162.117266930"
Apr 20 07:52:50.813707 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:50.813661 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-znzxf" event={"ID":"deea7fc6-8ab1-4fa5-bc3e-7464e89e4318","Type":"ContainerStarted","Data":"0c84a2fce37b504ecc5a02a91db4bbbabf00673c8b191bd8d88bf238a3a6a481"}
Apr 20 07:52:50.833189 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:50.833141 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-znzxf" podStartSLOduration=1.9744972600000001 podStartE2EDuration="3.83312749s" podCreationTimestamp="2026-04-20 07:52:47 +0000 UTC" firstStartedPulling="2026-04-20 07:52:47.999957929 +0000 UTC m=+161.296530892" lastFinishedPulling="2026-04-20 07:52:49.858588153 +0000 UTC m=+163.155161122" observedRunningTime="2026-04-20 07:52:50.832434831 +0000 UTC m=+164.129007816" watchObservedRunningTime="2026-04-20 07:52:50.83312749 +0000 UTC m=+164.129700509"
Apr 20 07:52:52.276310 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:52.276265 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gg5hx"
Apr 20 07:52:52.278396 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:52.278375 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mxq77\""
Apr 20 07:52:52.286755 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:52.286732 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gg5hx"
Apr 20 07:52:52.397354 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:52.397325 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gg5hx"]
Apr 20 07:52:52.400467 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:52:52.400441 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba51e49_9c17_47c6_813d_05581eece4d6.slice/crio-fc8f98c868a6edfbf6cab5238f522c4e16649f988ebc80d1124a1ad8c92ad7fa WatchSource:0}: Error finding container fc8f98c868a6edfbf6cab5238f522c4e16649f988ebc80d1124a1ad8c92ad7fa: Status 404 returned error can't find the container with id fc8f98c868a6edfbf6cab5238f522c4e16649f988ebc80d1124a1ad8c92ad7fa
Apr 20 07:52:52.819965 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:52.819924 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gg5hx" event={"ID":"5ba51e49-9c17-47c6-813d-05581eece4d6","Type":"ContainerStarted","Data":"fc8f98c868a6edfbf6cab5238f522c4e16649f988ebc80d1124a1ad8c92ad7fa"}
Apr 20 07:52:54.826383 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:54.826351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gg5hx" event={"ID":"5ba51e49-9c17-47c6-813d-05581eece4d6","Type":"ContainerStarted","Data":"57b3d0da83dab4e5cad21aa80a26964e9bdcc16064c2d4c4d17bb8f945b9f3ce"}
Apr 20 07:52:54.851902 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:54.851737 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gg5hx" podStartSLOduration=135.381059065 podStartE2EDuration="2m16.851718997s" podCreationTimestamp="2026-04-20 07:50:38 +0000 UTC" firstStartedPulling="2026-04-20 07:52:52.402385078 +0000 UTC m=+165.698958040" lastFinishedPulling="2026-04-20 07:52:53.873044996 +0000 UTC m=+167.169617972" observedRunningTime="2026-04-20 07:52:54.850382427 +0000 UTC m=+168.146955410" watchObservedRunningTime="2026-04-20 07:52:54.851718997 +0000 UTC m=+168.148291983"
Apr 20 07:52:55.081344 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.081268 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-r9288"]
Apr 20 07:52:55.083967 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.083951 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288"
Apr 20 07:52:55.085846 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.085826 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 07:52:55.085927 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.085826 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 07:52:55.086341 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.086321 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 07:52:55.086421 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.086349 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 07:52:55.086489 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.086436 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 07:52:55.086489 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.086434 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8t8pn\""
Apr 20 07:52:55.093309 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.093290 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-r9288"]
Apr 20 07:52:55.102463 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.102443 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kwntf"]
Apr 20 07:52:55.105846 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.105829 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf"
Apr 20 07:52:55.107857 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.107839 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 20 07:52:55.107857 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.107853 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-pk6pl\""
Apr 20 07:52:55.107994 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.107894 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 20 07:52:55.107994 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.107922 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 20 07:52:55.115576 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.115556 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kwntf"]
Apr 20 07:52:55.123377 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.123355 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qjwjd"]
Apr 20 07:52:55.126464 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.126446 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.128339 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.128319 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 07:52:55.128433 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.128354 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 07:52:55.128489 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.128445 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-59mhk\""
Apr 20 07:52:55.128537 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.128486 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 07:52:55.190882 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.190848 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-sys\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.191075 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.190887 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-metrics-client-ca\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.191075 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.190956 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/590627fc-8f17-4708-806d-6d1aaa587b47-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288"
Apr 20 07:52:55.191075 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.190995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-textfile\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.191075 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191017 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf"
Apr 20 07:52:55.191075 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191043 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf"
Apr 20 07:52:55.191334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191082 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-root\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.191334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191105 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.191334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191139 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf"
Apr 20 07:52:55.191334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191166 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/590627fc-8f17-4708-806d-6d1aaa587b47-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288"
Apr 20 07:52:55.191334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191260 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-accelerators-collector-config\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.191334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191280 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf"
Apr 20 07:52:55.191334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191299 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fksd\" (UniqueName: \"kubernetes.io/projected/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-kube-api-access-6fksd\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.191334 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191319 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-wtmp\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.191606 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191343 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf"
Apr 20 07:52:55.191606 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191359 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/590627fc-8f17-4708-806d-6d1aaa587b47-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288"
Apr 20 07:52:55.191606 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191376 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqkn\" (UniqueName: \"kubernetes.io/projected/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-api-access-ckqkn\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf"
Apr 20 07:52:55.191606 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191391 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88bgh\" (UniqueName: \"kubernetes.io/projected/590627fc-8f17-4708-806d-6d1aaa587b47-kube-api-access-88bgh\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288"
Apr 20 07:52:55.191606 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.191409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-tls\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.276752 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.276720 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv"
Apr 20 07:52:55.291709 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.291678 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fksd\" (UniqueName: \"kubernetes.io/projected/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-kube-api-access-6fksd\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.291845 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.291716 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-wtmp\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.291845 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.291743 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf"
Apr 20 07:52:55.291948 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.291858 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-wtmp\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd"
Apr 20 07:52:55.291948 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.291909 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\"
(UniqueName: \"kubernetes.io/secret/590627fc-8f17-4708-806d-6d1aaa587b47-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" Apr 20 07:52:55.291948 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.291936 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqkn\" (UniqueName: \"kubernetes.io/projected/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-api-access-ckqkn\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.292098 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.291956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88bgh\" (UniqueName: \"kubernetes.io/projected/590627fc-8f17-4708-806d-6d1aaa587b47-kube-api-access-88bgh\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" Apr 20 07:52:55.292098 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.291974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-tls\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.292098 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292000 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-sys\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.292098 
ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-sys\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.292098 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292093 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-metrics-client-ca\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.292360 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292143 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/590627fc-8f17-4708-806d-6d1aaa587b47-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" Apr 20 07:52:55.292360 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292172 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-textfile\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.292360 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") 
" pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.292360 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.292360 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292302 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-root\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.292360 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.292639 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292365 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.292639 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292400 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/590627fc-8f17-4708-806d-6d1aaa587b47-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" Apr 20 07:52:55.292639 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292433 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-accelerators-collector-config\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.292639 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292465 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.292639 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292536 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-textfile\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.292881 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292790 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-metrics-client-ca\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 
07:52:55.292881 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.292864 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-root\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.293128 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.293068 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.293262 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.293160 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/590627fc-8f17-4708-806d-6d1aaa587b47-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" Apr 20 07:52:55.293418 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.293390 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-accelerators-collector-config\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.293503 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.293467 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.293503 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.293479 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.294955 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.294927 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-tls\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.295192 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.295166 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/590627fc-8f17-4708-806d-6d1aaa587b47-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" Apr 20 07:52:55.295314 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.295244 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/590627fc-8f17-4708-806d-6d1aaa587b47-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" Apr 20 07:52:55.295434 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.295412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.295515 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.295501 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.295724 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.295708 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.303109 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.303088 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88bgh\" (UniqueName: \"kubernetes.io/projected/590627fc-8f17-4708-806d-6d1aaa587b47-kube-api-access-88bgh\") pod \"openshift-state-metrics-9d44df66c-r9288\" (UID: \"590627fc-8f17-4708-806d-6d1aaa587b47\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" Apr 20 07:52:55.303267 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.303242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-6fksd\" (UniqueName: \"kubernetes.io/projected/d1a29c4b-4ce9-483b-9569-26d9b1d19d6d-kube-api-access-6fksd\") pod \"node-exporter-qjwjd\" (UID: \"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d\") " pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.303438 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.303420 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqkn\" (UniqueName: \"kubernetes.io/projected/7dacd9e0-2a6d-4b57-8ff9-7f04940233b2-kube-api-access-ckqkn\") pod \"kube-state-metrics-69db897b98-kwntf\" (UID: \"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.438460 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.438388 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" Apr 20 07:52:55.443024 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.443000 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" Apr 20 07:52:55.448623 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.448600 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qjwjd" Apr 20 07:52:55.456294 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:52:55.456262 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a29c4b_4ce9_483b_9569_26d9b1d19d6d.slice/crio-4cacf79476a5fdae925cf1f3b632e0def30e029fbf01abfe3c43b67979ef58f2 WatchSource:0}: Error finding container 4cacf79476a5fdae925cf1f3b632e0def30e029fbf01abfe3c43b67979ef58f2: Status 404 returned error can't find the container with id 4cacf79476a5fdae925cf1f3b632e0def30e029fbf01abfe3c43b67979ef58f2 Apr 20 07:52:55.557702 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.557672 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-r9288"] Apr 20 07:52:55.561115 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:52:55.561080 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod590627fc_8f17_4708_806d_6d1aaa587b47.slice/crio-438f7d685e2f03eec54a094ed1b0ed823995c1c3cd2da9c98cd924fd7fef1cdb WatchSource:0}: Error finding container 438f7d685e2f03eec54a094ed1b0ed823995c1c3cd2da9c98cd924fd7fef1cdb: Status 404 returned error can't find the container with id 438f7d685e2f03eec54a094ed1b0ed823995c1c3cd2da9c98cd924fd7fef1cdb Apr 20 07:52:55.575220 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.575187 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kwntf"] Apr 20 07:52:55.578275 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:52:55.578247 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dacd9e0_2a6d_4b57_8ff9_7f04940233b2.slice/crio-cc47bf08e124c5afb03f253b7e64bb340ffd65ecc4dba09528388f3b920268d4 WatchSource:0}: Error finding container 
cc47bf08e124c5afb03f253b7e64bb340ffd65ecc4dba09528388f3b920268d4: Status 404 returned error can't find the container with id cc47bf08e124c5afb03f253b7e64bb340ffd65ecc4dba09528388f3b920268d4 Apr 20 07:52:55.830112 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.829980 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" event={"ID":"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2","Type":"ContainerStarted","Data":"cc47bf08e124c5afb03f253b7e64bb340ffd65ecc4dba09528388f3b920268d4"} Apr 20 07:52:55.832046 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.832002 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" event={"ID":"590627fc-8f17-4708-806d-6d1aaa587b47","Type":"ContainerStarted","Data":"0af6b7ce7584f803e24d6004d4012191670b165099d71342a00c7891bb55094d"} Apr 20 07:52:55.832046 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.832042 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" event={"ID":"590627fc-8f17-4708-806d-6d1aaa587b47","Type":"ContainerStarted","Data":"4c79875d9b87d54a774af493a0c7f46f4a42960f3305b2e01af1e9cca86df376"} Apr 20 07:52:55.832253 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.832058 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" event={"ID":"590627fc-8f17-4708-806d-6d1aaa587b47","Type":"ContainerStarted","Data":"438f7d685e2f03eec54a094ed1b0ed823995c1c3cd2da9c98cd924fd7fef1cdb"} Apr 20 07:52:55.833354 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.833330 2569 generic.go:358] "Generic (PLEG): container finished" podID="67e7a84b-83ee-4bca-8972-e2dd70f4dddc" containerID="f6a84d087a396961bcaf60f4e567952bdf561f8de923b203cdee4d9b0eaf0975" exitCode=255 Apr 20 07:52:55.833474 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.833405 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" event={"ID":"67e7a84b-83ee-4bca-8972-e2dd70f4dddc","Type":"ContainerDied","Data":"f6a84d087a396961bcaf60f4e567952bdf561f8de923b203cdee4d9b0eaf0975"} Apr 20 07:52:55.833781 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.833761 2569 scope.go:117] "RemoveContainer" containerID="f6a84d087a396961bcaf60f4e567952bdf561f8de923b203cdee4d9b0eaf0975" Apr 20 07:52:55.834884 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:55.834865 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qjwjd" event={"ID":"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d","Type":"ContainerStarted","Data":"4cacf79476a5fdae925cf1f3b632e0def30e029fbf01abfe3c43b67979ef58f2"} Apr 20 07:52:56.841245 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:56.841190 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-548cc44b99-q6bnp" event={"ID":"67e7a84b-83ee-4bca-8972-e2dd70f4dddc","Type":"ContainerStarted","Data":"08a10fad4e989dc11de65b07887b9a0261a2b2a08971e5741f08e21e3cbc06e6"} Apr 20 07:52:56.842709 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:56.842683 2569 generic.go:358] "Generic (PLEG): container finished" podID="d1a29c4b-4ce9-483b-9569-26d9b1d19d6d" containerID="6d4af911b647a68ed2d4490697a610a074d1a06188004daf4e52646f9758503e" exitCode=0 Apr 20 07:52:56.842840 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:56.842732 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qjwjd" event={"ID":"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d","Type":"ContainerDied","Data":"6d4af911b647a68ed2d4490697a610a074d1a06188004daf4e52646f9758503e"} Apr 20 07:52:57.846972 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:57.846930 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qjwjd" 
event={"ID":"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d","Type":"ContainerStarted","Data":"0c0fa29f84248b449a4a170c40afb27b8641e5585770ae6c61a5b23d8b14cbcd"} Apr 20 07:52:57.846972 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:57.846977 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qjwjd" event={"ID":"d1a29c4b-4ce9-483b-9569-26d9b1d19d6d","Type":"ContainerStarted","Data":"e72c7b23c98211f258e2426c378d06386c60616e5a0392b77a7c8fbdc1a8a4ef"} Apr 20 07:52:57.848718 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:57.848695 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" event={"ID":"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2","Type":"ContainerStarted","Data":"926a8a2caf39560e86e2a8233cc786bb04846cca84f1b258088c04836ce1bbdf"} Apr 20 07:52:57.848770 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:57.848727 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" event={"ID":"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2","Type":"ContainerStarted","Data":"6b2e185c34ef60cc11997c53c09e17de22c2754d383aeb1b0ad6400323b1ec6c"} Apr 20 07:52:57.848770 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:57.848740 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" event={"ID":"7dacd9e0-2a6d-4b57-8ff9-7f04940233b2","Type":"ContainerStarted","Data":"f1d9b6c55a1023f6c79ecc803d76b0a6c8d018ced9681f581cd4d065830298d8"} Apr 20 07:52:57.850443 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:57.850419 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" event={"ID":"590627fc-8f17-4708-806d-6d1aaa587b47","Type":"ContainerStarted","Data":"61476f0f12e2812ef18ee74b50e715533922ad69eb2793a5dd2b27009a16a104"} Apr 20 07:52:57.863319 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:57.863278 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qjwjd" podStartSLOduration=2.184870247 podStartE2EDuration="2.863265674s" podCreationTimestamp="2026-04-20 07:52:55 +0000 UTC" firstStartedPulling="2026-04-20 07:52:55.458377245 +0000 UTC m=+168.754950222" lastFinishedPulling="2026-04-20 07:52:56.136772683 +0000 UTC m=+169.433345649" observedRunningTime="2026-04-20 07:52:57.862307608 +0000 UTC m=+171.158880589" watchObservedRunningTime="2026-04-20 07:52:57.863265674 +0000 UTC m=+171.159838658" Apr 20 07:52:57.878316 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:57.878277 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r9288" podStartSLOduration=1.529426827 podStartE2EDuration="2.87826629s" podCreationTimestamp="2026-04-20 07:52:55 +0000 UTC" firstStartedPulling="2026-04-20 07:52:55.676474167 +0000 UTC m=+168.973047128" lastFinishedPulling="2026-04-20 07:52:57.025313613 +0000 UTC m=+170.321886591" observedRunningTime="2026-04-20 07:52:57.877143169 +0000 UTC m=+171.173716150" watchObservedRunningTime="2026-04-20 07:52:57.87826629 +0000 UTC m=+171.174839274" Apr 20 07:52:57.893977 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:57.893930 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-kwntf" podStartSLOduration=1.4504254730000001 podStartE2EDuration="2.893915773s" podCreationTimestamp="2026-04-20 07:52:55 +0000 UTC" firstStartedPulling="2026-04-20 07:52:55.580180499 +0000 UTC m=+168.876753476" lastFinishedPulling="2026-04-20 07:52:57.023670815 +0000 UTC m=+170.320243776" observedRunningTime="2026-04-20 07:52:57.892699674 +0000 UTC m=+171.189272658" watchObservedRunningTime="2026-04-20 07:52:57.893915773 +0000 UTC m=+171.190488759" Apr 20 07:52:58.811358 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:58.811320 2569 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-dns/dns-default-6pmzx" Apr 20 07:52:59.494191 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.494157 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5bc96c64f4-dxtfp"] Apr 20 07:52:59.497347 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.497330 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.499745 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.499716 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-a64pf3koa5418\"" Apr 20 07:52:59.499858 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.499752 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 07:52:59.499858 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.499809 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-d8rn4\"" Apr 20 07:52:59.499858 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.499822 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 07:52:59.500004 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.499900 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 07:52:59.500004 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.499898 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 07:52:59.504612 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.504591 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5bc96c64f4-dxtfp"] Apr 20 07:52:59.628326 
ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.628283 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/74f750d2-e3e6-453d-ba47-5f700e14b402-secret-metrics-server-client-certs\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.628326 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.628327 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74f750d2-e3e6-453d-ba47-5f700e14b402-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.628550 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.628360 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96pm4\" (UniqueName: \"kubernetes.io/projected/74f750d2-e3e6-453d-ba47-5f700e14b402-kube-api-access-96pm4\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.628550 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.628416 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/74f750d2-e3e6-453d-ba47-5f700e14b402-metrics-server-audit-profiles\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.628550 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.628479 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/74f750d2-e3e6-453d-ba47-5f700e14b402-audit-log\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.628550 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.628512 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/74f750d2-e3e6-453d-ba47-5f700e14b402-secret-metrics-server-tls\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.628550 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.628535 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f750d2-e3e6-453d-ba47-5f700e14b402-client-ca-bundle\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.729899 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.729864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/74f750d2-e3e6-453d-ba47-5f700e14b402-secret-metrics-server-client-certs\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.729899 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.729901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/74f750d2-e3e6-453d-ba47-5f700e14b402-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.730144 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.729929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96pm4\" (UniqueName: \"kubernetes.io/projected/74f750d2-e3e6-453d-ba47-5f700e14b402-kube-api-access-96pm4\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.730144 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.729961 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/74f750d2-e3e6-453d-ba47-5f700e14b402-metrics-server-audit-profiles\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.730144 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.729988 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/74f750d2-e3e6-453d-ba47-5f700e14b402-audit-log\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.730144 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.730017 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/74f750d2-e3e6-453d-ba47-5f700e14b402-secret-metrics-server-tls\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 
20 07:52:59.730144 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.730051 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f750d2-e3e6-453d-ba47-5f700e14b402-client-ca-bundle\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.730483 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.730454 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/74f750d2-e3e6-453d-ba47-5f700e14b402-audit-log\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.730761 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.730728 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74f750d2-e3e6-453d-ba47-5f700e14b402-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.730973 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.730952 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/74f750d2-e3e6-453d-ba47-5f700e14b402-metrics-server-audit-profiles\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.732529 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.732504 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f750d2-e3e6-453d-ba47-5f700e14b402-client-ca-bundle\") pod 
\"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.732639 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.732580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/74f750d2-e3e6-453d-ba47-5f700e14b402-secret-metrics-server-client-certs\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.732639 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.732588 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/74f750d2-e3e6-453d-ba47-5f700e14b402-secret-metrics-server-tls\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.736693 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.736672 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96pm4\" (UniqueName: \"kubernetes.io/projected/74f750d2-e3e6-453d-ba47-5f700e14b402-kube-api-access-96pm4\") pod \"metrics-server-5bc96c64f4-dxtfp\" (UID: \"74f750d2-e3e6-453d-ba47-5f700e14b402\") " pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.806991 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.806958 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:52:59.926784 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:52:59.926754 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5bc96c64f4-dxtfp"] Apr 20 07:52:59.929686 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:52:59.929649 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74f750d2_e3e6_453d_ba47_5f700e14b402.slice/crio-afe0815fa2caf86947ce622a6f712686a1c2c4469e7e5fe21fd21056ce799af1 WatchSource:0}: Error finding container afe0815fa2caf86947ce622a6f712686a1c2c4469e7e5fe21fd21056ce799af1: Status 404 returned error can't find the container with id afe0815fa2caf86947ce622a6f712686a1c2c4469e7e5fe21fd21056ce799af1 Apr 20 07:53:00.861275 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:00.861238 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" event={"ID":"74f750d2-e3e6-453d-ba47-5f700e14b402","Type":"ContainerStarted","Data":"afe0815fa2caf86947ce622a6f712686a1c2c4469e7e5fe21fd21056ce799af1"} Apr 20 07:53:01.865559 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:01.865523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" event={"ID":"74f750d2-e3e6-453d-ba47-5f700e14b402","Type":"ContainerStarted","Data":"abe7f228dffc1da2c5d804a5fbc7341971f02ffebb29b5d882e752ceb2a8535d"} Apr 20 07:53:01.880490 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:01.880443 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" podStartSLOduration=1.5566127669999998 podStartE2EDuration="2.880430801s" podCreationTimestamp="2026-04-20 07:52:59 +0000 UTC" firstStartedPulling="2026-04-20 07:52:59.931546054 +0000 UTC m=+173.228119016" lastFinishedPulling="2026-04-20 07:53:01.255364089 
+0000 UTC m=+174.551937050" observedRunningTime="2026-04-20 07:53:01.879527714 +0000 UTC m=+175.176100699" watchObservedRunningTime="2026-04-20 07:53:01.880430801 +0000 UTC m=+175.177003810" Apr 20 07:53:19.807692 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:19.807651 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:53:19.807692 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:19.807697 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:53:29.709421 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:29.709356 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" podUID="3bf59936-0a81-4ee4-9a0b-20c6903ce458" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 07:53:39.708975 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:39.708931 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" podUID="3bf59936-0a81-4ee4-9a0b-20c6903ce458" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 07:53:39.813333 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:39.813303 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:53:39.817118 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:39.817097 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5bc96c64f4-dxtfp" Apr 20 07:53:49.709522 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:49.709479 2569 prober.go:120] "Probe failed" probeType="Liveness" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" podUID="3bf59936-0a81-4ee4-9a0b-20c6903ce458" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 07:53:49.710018 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:49.709564 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" Apr 20 07:53:49.710237 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:49.710179 2569 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"38f41ef2af7e226c7d38aeee69e2e883845960ef2e9581f05646db43b7931667"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 07:53:49.710321 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:49.710278 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" podUID="3bf59936-0a81-4ee4-9a0b-20c6903ce458" containerName="service-proxy" containerID="cri-o://38f41ef2af7e226c7d38aeee69e2e883845960ef2e9581f05646db43b7931667" gracePeriod=30 Apr 20 07:53:49.990262 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:49.990164 2569 generic.go:358] "Generic (PLEG): container finished" podID="3bf59936-0a81-4ee4-9a0b-20c6903ce458" containerID="38f41ef2af7e226c7d38aeee69e2e883845960ef2e9581f05646db43b7931667" exitCode=2 Apr 20 07:53:49.990262 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:49.990240 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" event={"ID":"3bf59936-0a81-4ee4-9a0b-20c6903ce458","Type":"ContainerDied","Data":"38f41ef2af7e226c7d38aeee69e2e883845960ef2e9581f05646db43b7931667"} Apr 20 
07:53:49.990428 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:53:49.990276 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-585fc5c8cd-fsdj8" event={"ID":"3bf59936-0a81-4ee4-9a0b-20c6903ce458","Type":"ContainerStarted","Data":"29dc4ffa0ff4305744865f5b0e9dd92f5b5c56367ce98b5dee605c3d4bd38b1c"} Apr 20 07:54:18.028480 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:54:18.028441 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:54:18.030787 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:54:18.030762 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e96090b-285a-4c1b-98c7-6793626b3969-metrics-certs\") pod \"network-metrics-daemon-m5qfv\" (UID: \"0e96090b-285a-4c1b-98c7-6793626b3969\") " pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:54:18.079709 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:54:18.079646 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2g69h\"" Apr 20 07:54:18.088153 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:54:18.088132 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m5qfv" Apr 20 07:54:18.217959 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:54:18.217926 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m5qfv"] Apr 20 07:54:18.221415 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:54:18.221388 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e96090b_285a_4c1b_98c7_6793626b3969.slice/crio-67bd8f73fafac91bfddeac95d34500846c0415785c0c6e12f22d2fad76710936 WatchSource:0}: Error finding container 67bd8f73fafac91bfddeac95d34500846c0415785c0c6e12f22d2fad76710936: Status 404 returned error can't find the container with id 67bd8f73fafac91bfddeac95d34500846c0415785c0c6e12f22d2fad76710936 Apr 20 07:54:19.071464 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:54:19.071414 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m5qfv" event={"ID":"0e96090b-285a-4c1b-98c7-6793626b3969","Type":"ContainerStarted","Data":"67bd8f73fafac91bfddeac95d34500846c0415785c0c6e12f22d2fad76710936"} Apr 20 07:54:20.074967 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:54:20.074929 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m5qfv" event={"ID":"0e96090b-285a-4c1b-98c7-6793626b3969","Type":"ContainerStarted","Data":"a7e200b22f2362475e04f68eb6408b9dd71d82adf014f9c8725bff1c69a0f1cb"} Apr 20 07:54:20.074967 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:54:20.074966 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m5qfv" event={"ID":"0e96090b-285a-4c1b-98c7-6793626b3969","Type":"ContainerStarted","Data":"24417944306f632f274c910e7208a9fe2b4fa217f2bb96a976933fbbeec96921"} Apr 20 07:54:20.089404 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:54:20.089361 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-m5qfv" podStartSLOduration=252.082967613 podStartE2EDuration="4m13.089346159s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:54:18.223008533 +0000 UTC m=+251.519581499" lastFinishedPulling="2026-04-20 07:54:19.229387084 +0000 UTC m=+252.525960045" observedRunningTime="2026-04-20 07:54:20.088521461 +0000 UTC m=+253.385094446" watchObservedRunningTime="2026-04-20 07:54:20.089346159 +0000 UTC m=+253.385919134" Apr 20 07:55:07.181120 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:55:07.181087 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 07:55:07.182643 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:55:07.182619 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 07:55:07.187665 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:55:07.187633 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 07:56:53.530840 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.530805 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw"] Apr 20 07:56:53.534013 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.533995 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.535944 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.535922 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bht96\"" Apr 20 07:56:53.536054 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.535969 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 07:56:53.536377 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.536357 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 07:56:53.541984 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.541963 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw"] Apr 20 07:56:53.616491 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.616449 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.616491 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.616499 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.616702 ip-10-0-138-4 
kubenswrapper[2569]: I0420 07:56:53.616630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fzhz\" (UniqueName: \"kubernetes.io/projected/ac2ed235-481f-4de0-87f1-1c81a21d7383-kube-api-access-8fzhz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.717588 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.717554 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.717743 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.717620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fzhz\" (UniqueName: \"kubernetes.io/projected/ac2ed235-481f-4de0-87f1-1c81a21d7383-kube-api-access-8fzhz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.717743 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.717643 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.718004 ip-10-0-138-4 kubenswrapper[2569]: I0420 
07:56:53.717989 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.718046 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.717986 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.724752 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.724731 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fzhz\" (UniqueName: \"kubernetes.io/projected/ac2ed235-481f-4de0-87f1-1c81a21d7383-kube-api-access-8fzhz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.843679 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.843592 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:56:53.966504 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.966481 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw"] Apr 20 07:56:53.969177 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:56:53.969131 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac2ed235_481f_4de0_87f1_1c81a21d7383.slice/crio-09e9e3a2c027eede9d6c67d9f94e54843f3a40b81229b0c693c10727e6af4f41 WatchSource:0}: Error finding container 09e9e3a2c027eede9d6c67d9f94e54843f3a40b81229b0c693c10727e6af4f41: Status 404 returned error can't find the container with id 09e9e3a2c027eede9d6c67d9f94e54843f3a40b81229b0c693c10727e6af4f41 Apr 20 07:56:53.971423 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:53.971405 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:56:54.477679 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:56:54.477642 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" event={"ID":"ac2ed235-481f-4de0-87f1-1c81a21d7383","Type":"ContainerStarted","Data":"09e9e3a2c027eede9d6c67d9f94e54843f3a40b81229b0c693c10727e6af4f41"} Apr 20 07:57:00.495649 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:00.495610 2569 generic.go:358] "Generic (PLEG): container finished" podID="ac2ed235-481f-4de0-87f1-1c81a21d7383" containerID="7ff5f90d85bb02dc0fdc4451ee00c376abb991c786c8ba60185bb3f7edcd777a" exitCode=0 Apr 20 07:57:00.495649 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:00.495654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" 
event={"ID":"ac2ed235-481f-4de0-87f1-1c81a21d7383","Type":"ContainerDied","Data":"7ff5f90d85bb02dc0fdc4451ee00c376abb991c786c8ba60185bb3f7edcd777a"} Apr 20 07:57:02.504250 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:02.504199 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" event={"ID":"ac2ed235-481f-4de0-87f1-1c81a21d7383","Type":"ContainerStarted","Data":"1444035422fd7871feccb5eee49a1a83ce28b7e3b81b1d464701e05193dece9b"} Apr 20 07:57:03.508712 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:03.508677 2569 generic.go:358] "Generic (PLEG): container finished" podID="ac2ed235-481f-4de0-87f1-1c81a21d7383" containerID="1444035422fd7871feccb5eee49a1a83ce28b7e3b81b1d464701e05193dece9b" exitCode=0 Apr 20 07:57:03.509072 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:03.508741 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" event={"ID":"ac2ed235-481f-4de0-87f1-1c81a21d7383","Type":"ContainerDied","Data":"1444035422fd7871feccb5eee49a1a83ce28b7e3b81b1d464701e05193dece9b"} Apr 20 07:57:10.531621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:10.531583 2569 generic.go:358] "Generic (PLEG): container finished" podID="ac2ed235-481f-4de0-87f1-1c81a21d7383" containerID="72c6035de583b248608b268160439eeb97d21858bbf0af8927cf9703f1ab9373" exitCode=0 Apr 20 07:57:10.532001 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:10.531657 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" event={"ID":"ac2ed235-481f-4de0-87f1-1c81a21d7383","Type":"ContainerDied","Data":"72c6035de583b248608b268160439eeb97d21858bbf0af8927cf9703f1ab9373"} Apr 20 07:57:11.649358 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:11.649336 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" Apr 20 07:57:11.767942 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:11.767910 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-bundle\") pod \"ac2ed235-481f-4de0-87f1-1c81a21d7383\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " Apr 20 07:57:11.767942 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:11.767941 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-util\") pod \"ac2ed235-481f-4de0-87f1-1c81a21d7383\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " Apr 20 07:57:11.768148 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:11.768010 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fzhz\" (UniqueName: \"kubernetes.io/projected/ac2ed235-481f-4de0-87f1-1c81a21d7383-kube-api-access-8fzhz\") pod \"ac2ed235-481f-4de0-87f1-1c81a21d7383\" (UID: \"ac2ed235-481f-4de0-87f1-1c81a21d7383\") " Apr 20 07:57:11.768502 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:11.768475 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-bundle" (OuterVolumeSpecName: "bundle") pod "ac2ed235-481f-4de0-87f1-1c81a21d7383" (UID: "ac2ed235-481f-4de0-87f1-1c81a21d7383"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:57:11.770263 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:11.770233 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2ed235-481f-4de0-87f1-1c81a21d7383-kube-api-access-8fzhz" (OuterVolumeSpecName: "kube-api-access-8fzhz") pod "ac2ed235-481f-4de0-87f1-1c81a21d7383" (UID: "ac2ed235-481f-4de0-87f1-1c81a21d7383"). InnerVolumeSpecName "kube-api-access-8fzhz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:57:11.771910 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:11.771886 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-util" (OuterVolumeSpecName: "util") pod "ac2ed235-481f-4de0-87f1-1c81a21d7383" (UID: "ac2ed235-481f-4de0-87f1-1c81a21d7383"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:57:11.868825 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:11.868725 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-bundle\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:57:11.868825 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:11.868765 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac2ed235-481f-4de0-87f1-1c81a21d7383-util\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:57:11.868825 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:11.868775 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8fzhz\" (UniqueName: \"kubernetes.io/projected/ac2ed235-481f-4de0-87f1-1c81a21d7383-kube-api-access-8fzhz\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:57:12.538834 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:12.538796 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw" event={"ID":"ac2ed235-481f-4de0-87f1-1c81a21d7383","Type":"ContainerDied","Data":"09e9e3a2c027eede9d6c67d9f94e54843f3a40b81229b0c693c10727e6af4f41"}
Apr 20 07:57:12.538834 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:12.538833 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09e9e3a2c027eede9d6c67d9f94e54843f3a40b81229b0c693c10727e6af4f41"
Apr 20 07:57:12.539037 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:12.538831 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5q68sw"
Apr 20 07:57:15.835449 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.835406 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"]
Apr 20 07:57:15.836024 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.835656 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac2ed235-481f-4de0-87f1-1c81a21d7383" containerName="extract"
Apr 20 07:57:15.836024 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.835669 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2ed235-481f-4de0-87f1-1c81a21d7383" containerName="extract"
Apr 20 07:57:15.836024 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.835681 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac2ed235-481f-4de0-87f1-1c81a21d7383" containerName="util"
Apr 20 07:57:15.836024 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.835688 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2ed235-481f-4de0-87f1-1c81a21d7383" containerName="util"
Apr 20 07:57:15.836024 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.835702 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac2ed235-481f-4de0-87f1-1c81a21d7383" containerName="pull"
Apr 20 07:57:15.836024 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.835707 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2ed235-481f-4de0-87f1-1c81a21d7383" containerName="pull"
Apr 20 07:57:15.836024 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.835757 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac2ed235-481f-4de0-87f1-1c81a21d7383" containerName="extract"
Apr 20 07:57:15.874882 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.874852 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"]
Apr 20 07:57:15.875035 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.874969 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"
Apr 20 07:57:15.877186 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.877155 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 20 07:57:15.877335 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.877169 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 20 07:57:15.877335 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.877242 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-xrqk4\""
Apr 20 07:57:15.900284 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.900258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9db8\" (UniqueName: \"kubernetes.io/projected/27af17d0-b16c-4608-b3ae-7d54408b9d29-kube-api-access-r9db8\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-cl4ss\" (UID: \"27af17d0-b16c-4608-b3ae-7d54408b9d29\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"
Apr 20 07:57:15.900510 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:15.900288 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27af17d0-b16c-4608-b3ae-7d54408b9d29-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-cl4ss\" (UID: \"27af17d0-b16c-4608-b3ae-7d54408b9d29\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"
Apr 20 07:57:16.000788 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:16.000753 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9db8\" (UniqueName: \"kubernetes.io/projected/27af17d0-b16c-4608-b3ae-7d54408b9d29-kube-api-access-r9db8\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-cl4ss\" (UID: \"27af17d0-b16c-4608-b3ae-7d54408b9d29\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"
Apr 20 07:57:16.000931 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:16.000794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27af17d0-b16c-4608-b3ae-7d54408b9d29-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-cl4ss\" (UID: \"27af17d0-b16c-4608-b3ae-7d54408b9d29\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"
Apr 20 07:57:16.001289 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:16.001270 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27af17d0-b16c-4608-b3ae-7d54408b9d29-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-cl4ss\" (UID: \"27af17d0-b16c-4608-b3ae-7d54408b9d29\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"
Apr 20 07:57:16.010115 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:16.010097 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9db8\" (UniqueName: \"kubernetes.io/projected/27af17d0-b16c-4608-b3ae-7d54408b9d29-kube-api-access-r9db8\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-cl4ss\" (UID: \"27af17d0-b16c-4608-b3ae-7d54408b9d29\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"
Apr 20 07:57:16.184614 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:16.184512 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"
Apr 20 07:57:16.309087 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:16.308927 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss"]
Apr 20 07:57:16.312771 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:57:16.312738 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27af17d0_b16c_4608_b3ae_7d54408b9d29.slice/crio-70c1e9e2347a4cbd1c84aaff278a8c1567f78a82f9dd2064a3eedc53d8307f89 WatchSource:0}: Error finding container 70c1e9e2347a4cbd1c84aaff278a8c1567f78a82f9dd2064a3eedc53d8307f89: Status 404 returned error can't find the container with id 70c1e9e2347a4cbd1c84aaff278a8c1567f78a82f9dd2064a3eedc53d8307f89
Apr 20 07:57:16.549385 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:16.549353 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss" event={"ID":"27af17d0-b16c-4608-b3ae-7d54408b9d29","Type":"ContainerStarted","Data":"70c1e9e2347a4cbd1c84aaff278a8c1567f78a82f9dd2064a3eedc53d8307f89"}
Apr 20 07:57:19.560527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:19.560489 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss" event={"ID":"27af17d0-b16c-4608-b3ae-7d54408b9d29","Type":"ContainerStarted","Data":"3a0a9713e778dd0f59c183b71782932c49646e0acf7b2deee9d58157e701a1bd"}
Apr 20 07:57:19.577570 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:19.577522 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-cl4ss" podStartSLOduration=2.417974155 podStartE2EDuration="4.577507087s" podCreationTimestamp="2026-04-20 07:57:15 +0000 UTC" firstStartedPulling="2026-04-20 07:57:16.315919138 +0000 UTC m=+429.612492099" lastFinishedPulling="2026-04-20 07:57:18.475452055 +0000 UTC m=+431.772025031" observedRunningTime="2026-04-20 07:57:19.575960108 +0000 UTC m=+432.872533093" watchObservedRunningTime="2026-04-20 07:57:19.577507087 +0000 UTC m=+432.874080070"
Apr 20 07:57:21.078561 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.078525 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"]
Apr 20 07:57:21.094726 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.094696 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"]
Apr 20 07:57:21.094869 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.094809 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.096681 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.096657 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 07:57:21.096802 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.096697 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bht96\""
Apr 20 07:57:21.096997 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.096981 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 07:57:21.132654 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.132626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.132793 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.132663 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5wt2\" (UniqueName: \"kubernetes.io/projected/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-kube-api-access-j5wt2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.132793 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.132712 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.233697 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.233659 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.233867 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.233726 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.233867 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.233747 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5wt2\" (UniqueName: \"kubernetes.io/projected/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-kube-api-access-j5wt2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.234035 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.234009 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.234089 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.234041 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.240932 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.240897 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5wt2\" (UniqueName: \"kubernetes.io/projected/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-kube-api-access-j5wt2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.405542 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.405461 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:21.520674 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.520642 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"]
Apr 20 07:57:21.523566 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:57:21.523535 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4c13b07_d9d8_420c_9e58_22ee4418d7ee.slice/crio-89898e201bf39ec019f421ef3da8d008e99b62e001effeca57848a89fca29eef WatchSource:0}: Error finding container 89898e201bf39ec019f421ef3da8d008e99b62e001effeca57848a89fca29eef: Status 404 returned error can't find the container with id 89898e201bf39ec019f421ef3da8d008e99b62e001effeca57848a89fca29eef
Apr 20 07:57:21.566527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:21.566500 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f" event={"ID":"a4c13b07-d9d8-420c-9e58-22ee4418d7ee","Type":"ContainerStarted","Data":"89898e201bf39ec019f421ef3da8d008e99b62e001effeca57848a89fca29eef"}
Apr 20 07:57:22.571417 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:22.571384 2569 generic.go:358] "Generic (PLEG): container finished" podID="a4c13b07-d9d8-420c-9e58-22ee4418d7ee" containerID="f06a23077c01e3445e0481d1b8299316e541ccfccf8fd705750f1c448cbcdb1d" exitCode=0
Apr 20 07:57:22.571797 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:22.571456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f" event={"ID":"a4c13b07-d9d8-420c-9e58-22ee4418d7ee","Type":"ContainerDied","Data":"f06a23077c01e3445e0481d1b8299316e541ccfccf8fd705750f1c448cbcdb1d"}
Apr 20 07:57:23.354926 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.354890 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"]
Apr 20 07:57:23.358330 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.358308 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"
Apr 20 07:57:23.360350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.360322 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 07:57:23.360460 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.360364 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-s5jt8\""
Apr 20 07:57:23.360814 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.360796 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 07:57:23.364295 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.364274 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"]
Apr 20 07:57:23.451262 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.451229 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/586404d1-abba-45e2-9a6c-3c58c3942997-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-nf7xj\" (UID: \"586404d1-abba-45e2-9a6c-3c58c3942997\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"
Apr 20 07:57:23.451443 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.451287 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-659xp\" (UniqueName: \"kubernetes.io/projected/586404d1-abba-45e2-9a6c-3c58c3942997-kube-api-access-659xp\") pod \"cert-manager-cainjector-8966b78d4-nf7xj\" (UID: \"586404d1-abba-45e2-9a6c-3c58c3942997\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"
Apr 20 07:57:23.552760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.552666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/586404d1-abba-45e2-9a6c-3c58c3942997-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-nf7xj\" (UID: \"586404d1-abba-45e2-9a6c-3c58c3942997\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"
Apr 20 07:57:23.552760 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.552752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-659xp\" (UniqueName: \"kubernetes.io/projected/586404d1-abba-45e2-9a6c-3c58c3942997-kube-api-access-659xp\") pod \"cert-manager-cainjector-8966b78d4-nf7xj\" (UID: \"586404d1-abba-45e2-9a6c-3c58c3942997\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"
Apr 20 07:57:23.562379 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.562351 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/586404d1-abba-45e2-9a6c-3c58c3942997-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-nf7xj\" (UID: \"586404d1-abba-45e2-9a6c-3c58c3942997\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"
Apr 20 07:57:23.562691 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.562642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-659xp\" (UniqueName: \"kubernetes.io/projected/586404d1-abba-45e2-9a6c-3c58c3942997-kube-api-access-659xp\") pod \"cert-manager-cainjector-8966b78d4-nf7xj\" (UID: \"586404d1-abba-45e2-9a6c-3c58c3942997\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"
Apr 20 07:57:23.670354 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.670270 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"
Apr 20 07:57:23.806427 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:23.806397 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-nf7xj"]
Apr 20 07:57:23.809333 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:57:23.809301 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod586404d1_abba_45e2_9a6c_3c58c3942997.slice/crio-acad246234b31cfaf24ea1e34cc47162933d79c22e60f209d2a0f8dc5f9d4af0 WatchSource:0}: Error finding container acad246234b31cfaf24ea1e34cc47162933d79c22e60f209d2a0f8dc5f9d4af0: Status 404 returned error can't find the container with id acad246234b31cfaf24ea1e34cc47162933d79c22e60f209d2a0f8dc5f9d4af0
Apr 20 07:57:24.581963 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:24.581926 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj" event={"ID":"586404d1-abba-45e2-9a6c-3c58c3942997","Type":"ContainerStarted","Data":"acad246234b31cfaf24ea1e34cc47162933d79c22e60f209d2a0f8dc5f9d4af0"}
Apr 20 07:57:25.587595 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:25.587561 2569 generic.go:358] "Generic (PLEG): container finished" podID="a4c13b07-d9d8-420c-9e58-22ee4418d7ee" containerID="9a885edaf588b07c34dedff025896f3dbb250f58261852ff20df0cd2fa9fef32" exitCode=0
Apr 20 07:57:25.588032 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:25.587645 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f" event={"ID":"a4c13b07-d9d8-420c-9e58-22ee4418d7ee","Type":"ContainerDied","Data":"9a885edaf588b07c34dedff025896f3dbb250f58261852ff20df0cd2fa9fef32"}
Apr 20 07:57:26.593312 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:26.593273 2569 generic.go:358] "Generic (PLEG): container finished" podID="a4c13b07-d9d8-420c-9e58-22ee4418d7ee" containerID="35187a9505fa910ee5bc88e54751fa2ef6ff2add625485ff6e400c2355ee7d00" exitCode=0
Apr 20 07:57:26.593749 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:26.593330 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f" event={"ID":"a4c13b07-d9d8-420c-9e58-22ee4418d7ee","Type":"ContainerDied","Data":"35187a9505fa910ee5bc88e54751fa2ef6ff2add625485ff6e400c2355ee7d00"}
Apr 20 07:57:27.597550 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.597510 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj" event={"ID":"586404d1-abba-45e2-9a6c-3c58c3942997","Type":"ContainerStarted","Data":"74d36b993210311495073fcb46196be31aaf5afc195fcd7041861936f364355e"}
Apr 20 07:57:27.613626 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.613567 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-nf7xj" podStartSLOduration=1.650635513 podStartE2EDuration="4.613548816s" podCreationTimestamp="2026-04-20 07:57:23 +0000 UTC" firstStartedPulling="2026-04-20 07:57:23.811521257 +0000 UTC m=+437.108094232" lastFinishedPulling="2026-04-20 07:57:26.774434572 +0000 UTC m=+440.071007535" observedRunningTime="2026-04-20 07:57:27.611385625 +0000 UTC m=+440.907958611" watchObservedRunningTime="2026-04-20 07:57:27.613548816 +0000 UTC m=+440.910121801"
Apr 20 07:57:27.724408 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.724385 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:27.785987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.785957 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5wt2\" (UniqueName: \"kubernetes.io/projected/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-kube-api-access-j5wt2\") pod \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") "
Apr 20 07:57:27.786144 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.786000 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-util\") pod \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") "
Apr 20 07:57:27.786144 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.786037 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-bundle\") pod \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\" (UID: \"a4c13b07-d9d8-420c-9e58-22ee4418d7ee\") "
Apr 20 07:57:27.786425 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.786394 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-bundle" (OuterVolumeSpecName: "bundle") pod "a4c13b07-d9d8-420c-9e58-22ee4418d7ee" (UID: "a4c13b07-d9d8-420c-9e58-22ee4418d7ee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:57:27.787960 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.787938 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-kube-api-access-j5wt2" (OuterVolumeSpecName: "kube-api-access-j5wt2") pod "a4c13b07-d9d8-420c-9e58-22ee4418d7ee" (UID: "a4c13b07-d9d8-420c-9e58-22ee4418d7ee"). InnerVolumeSpecName "kube-api-access-j5wt2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:57:27.790477 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.790449 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-util" (OuterVolumeSpecName: "util") pod "a4c13b07-d9d8-420c-9e58-22ee4418d7ee" (UID: "a4c13b07-d9d8-420c-9e58-22ee4418d7ee"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:57:27.887472 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.887391 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-util\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:57:27.887472 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.887417 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-bundle\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:57:27.887472 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:27.887428 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j5wt2\" (UniqueName: \"kubernetes.io/projected/a4c13b07-d9d8-420c-9e58-22ee4418d7ee-kube-api-access-j5wt2\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:57:28.602571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:28.602537 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f"
Apr 20 07:57:28.602571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:28.602540 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpsq7f" event={"ID":"a4c13b07-d9d8-420c-9e58-22ee4418d7ee","Type":"ContainerDied","Data":"89898e201bf39ec019f421ef3da8d008e99b62e001effeca57848a89fca29eef"}
Apr 20 07:57:28.603024 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:28.602591 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89898e201bf39ec019f421ef3da8d008e99b62e001effeca57848a89fca29eef"
Apr 20 07:57:40.323868 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.323827 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-5bwgj"]
Apr 20 07:57:40.324350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.324227 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c13b07-d9d8-420c-9e58-22ee4418d7ee" containerName="pull"
Apr 20 07:57:40.324350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.324258 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c13b07-d9d8-420c-9e58-22ee4418d7ee" containerName="pull"
Apr 20 07:57:40.324350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.324283 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c13b07-d9d8-420c-9e58-22ee4418d7ee" containerName="util"
Apr 20 07:57:40.324350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.324291 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c13b07-d9d8-420c-9e58-22ee4418d7ee" containerName="util"
Apr 20 07:57:40.324350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.324342 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c13b07-d9d8-420c-9e58-22ee4418d7ee" containerName="extract"
Apr 20 07:57:40.324593 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.324359 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c13b07-d9d8-420c-9e58-22ee4418d7ee" containerName="extract"
Apr 20 07:57:40.324593 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.324488 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4c13b07-d9d8-420c-9e58-22ee4418d7ee" containerName="extract"
Apr 20 07:57:40.329340 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.329317 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-5bwgj"]
Apr 20 07:57:40.329466 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.329399 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-5bwgj"
Apr 20 07:57:40.331189 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.331166 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-77qtq\""
Apr 20 07:57:40.375511 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.375483 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7-bound-sa-token\") pod \"cert-manager-759f64656b-5bwgj\" (UID: \"10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7\") " pod="cert-manager/cert-manager-759f64656b-5bwgj"
Apr 20 07:57:40.375635 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.375536 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78q99\" (UniqueName: \"kubernetes.io/projected/10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7-kube-api-access-78q99\") pod \"cert-manager-759f64656b-5bwgj\" (UID: \"10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7\") " pod="cert-manager/cert-manager-759f64656b-5bwgj"
Apr 20 07:57:40.476321 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.476285 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78q99\" (UniqueName: \"kubernetes.io/projected/10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7-kube-api-access-78q99\") pod \"cert-manager-759f64656b-5bwgj\" (UID: \"10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7\") " pod="cert-manager/cert-manager-759f64656b-5bwgj"
Apr 20 07:57:40.476484 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.476347 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7-bound-sa-token\") pod \"cert-manager-759f64656b-5bwgj\" (UID: \"10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7\") " pod="cert-manager/cert-manager-759f64656b-5bwgj"
Apr 20 07:57:40.483477 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.483454 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7-bound-sa-token\") pod \"cert-manager-759f64656b-5bwgj\" (UID: \"10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7\") " pod="cert-manager/cert-manager-759f64656b-5bwgj"
Apr 20 07:57:40.483669 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.483651 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78q99\" (UniqueName: \"kubernetes.io/projected/10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7-kube-api-access-78q99\") pod \"cert-manager-759f64656b-5bwgj\" (UID: \"10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7\") " pod="cert-manager/cert-manager-759f64656b-5bwgj"
Apr 20 07:57:40.638103 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.638008 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-5bwgj"
Apr 20 07:57:40.751264 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:40.751239 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-5bwgj"]
Apr 20 07:57:40.753937 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:57:40.753909 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10e8e2f4_1fcb_4193_8fd9_c05b5fc033c7.slice/crio-d8aab2bde36966ea7eb466a82910061ee454e9fc69d615740eb09832c10f3dde WatchSource:0}: Error finding container d8aab2bde36966ea7eb466a82910061ee454e9fc69d615740eb09832c10f3dde: Status 404 returned error can't find the container with id d8aab2bde36966ea7eb466a82910061ee454e9fc69d615740eb09832c10f3dde
Apr 20 07:57:41.281270 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.281240 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd"]
Apr 20 07:57:41.284737 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.284719 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd"
Apr 20 07:57:41.286627 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.286583 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 07:57:41.286627 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.286619 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bht96\""
Apr 20 07:57:41.287045 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.287031 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 07:57:41.289784 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.289765 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd"]
Apr 20 07:57:41.382911 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.382879 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd"
Apr 20 07:57:41.383313 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.382921 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd"
Apr 20 07:57:41.383313 ip-10-0-138-4
kubenswrapper[2569]: I0420 07:57:41.382966 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8vfs\" (UniqueName: \"kubernetes.io/projected/bf40407f-97a0-4f4b-b703-ecc8a6f916de-kube-api-access-w8vfs\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" Apr 20 07:57:41.483344 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.483316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" Apr 20 07:57:41.483484 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.483361 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" Apr 20 07:57:41.483484 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.483399 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8vfs\" (UniqueName: \"kubernetes.io/projected/bf40407f-97a0-4f4b-b703-ecc8a6f916de-kube-api-access-w8vfs\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" Apr 20 07:57:41.483771 ip-10-0-138-4 kubenswrapper[2569]: I0420 
07:57:41.483750 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" Apr 20 07:57:41.483807 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.483760 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" Apr 20 07:57:41.490728 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.490700 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8vfs\" (UniqueName: \"kubernetes.io/projected/bf40407f-97a0-4f4b-b703-ecc8a6f916de-kube-api-access-w8vfs\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" Apr 20 07:57:41.594931 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.594831 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" Apr 20 07:57:41.644319 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.644283 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-5bwgj" event={"ID":"10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7","Type":"ContainerStarted","Data":"35b86b7b2f7fa35a999c026dfb850871fdc9c25b04999253ed2319ae783f1785"} Apr 20 07:57:41.644470 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.644328 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-5bwgj" event={"ID":"10e8e2f4-1fcb-4193-8fd9-c05b5fc033c7","Type":"ContainerStarted","Data":"d8aab2bde36966ea7eb466a82910061ee454e9fc69d615740eb09832c10f3dde"} Apr 20 07:57:41.658332 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.658290 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-5bwgj" podStartSLOduration=1.658275187 podStartE2EDuration="1.658275187s" podCreationTimestamp="2026-04-20 07:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:57:41.65804698 +0000 UTC m=+454.954619964" watchObservedRunningTime="2026-04-20 07:57:41.658275187 +0000 UTC m=+454.954848168" Apr 20 07:57:41.712234 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:41.712178 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd"] Apr 20 07:57:41.716488 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:57:41.716459 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf40407f_97a0_4f4b_b703_ecc8a6f916de.slice/crio-11ffd60457ac9a8e89f074e8368f482e47c20bc8d0f657cf8c2d666dafe0f42b WatchSource:0}: Error finding container 
11ffd60457ac9a8e89f074e8368f482e47c20bc8d0f657cf8c2d666dafe0f42b: Status 404 returned error can't find the container with id 11ffd60457ac9a8e89f074e8368f482e47c20bc8d0f657cf8c2d666dafe0f42b Apr 20 07:57:42.648305 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:42.648271 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf40407f-97a0-4f4b-b703-ecc8a6f916de" containerID="c2b99b2c091e329538b2761ebc04c12725db881319d05b2e649040c36dfc8abf" exitCode=0 Apr 20 07:57:42.648800 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:42.648352 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" event={"ID":"bf40407f-97a0-4f4b-b703-ecc8a6f916de","Type":"ContainerDied","Data":"c2b99b2c091e329538b2761ebc04c12725db881319d05b2e649040c36dfc8abf"} Apr 20 07:57:42.648800 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:42.648382 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" event={"ID":"bf40407f-97a0-4f4b-b703-ecc8a6f916de","Type":"ContainerStarted","Data":"11ffd60457ac9a8e89f074e8368f482e47c20bc8d0f657cf8c2d666dafe0f42b"} Apr 20 07:57:43.653683 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:43.653650 2569 generic.go:358] "Generic (PLEG): container finished" podID="bf40407f-97a0-4f4b-b703-ecc8a6f916de" containerID="d175258026d9219c8302fe500c8c7dee9994e2a1b84b53f152e406781fe964a3" exitCode=0 Apr 20 07:57:43.654067 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:43.653726 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" event={"ID":"bf40407f-97a0-4f4b-b703-ecc8a6f916de","Type":"ContainerDied","Data":"d175258026d9219c8302fe500c8c7dee9994e2a1b84b53f152e406781fe964a3"} Apr 20 07:57:44.659127 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:44.659096 2569 generic.go:358] "Generic (PLEG): container 
finished" podID="bf40407f-97a0-4f4b-b703-ecc8a6f916de" containerID="780eb0ea328558a6d2e2a4b75f5ccc187b32d19b79f72371fb8071cda5d22919" exitCode=0 Apr 20 07:57:44.659492 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:44.659167 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" event={"ID":"bf40407f-97a0-4f4b-b703-ecc8a6f916de","Type":"ContainerDied","Data":"780eb0ea328558a6d2e2a4b75f5ccc187b32d19b79f72371fb8071cda5d22919"} Apr 20 07:57:45.781181 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:45.781160 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" Apr 20 07:57:45.815852 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:45.815827 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-util\") pod \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " Apr 20 07:57:45.815953 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:45.815866 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8vfs\" (UniqueName: \"kubernetes.io/projected/bf40407f-97a0-4f4b-b703-ecc8a6f916de-kube-api-access-w8vfs\") pod \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " Apr 20 07:57:45.815953 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:45.815907 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-bundle\") pod \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\" (UID: \"bf40407f-97a0-4f4b-b703-ecc8a6f916de\") " Apr 20 07:57:45.816795 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:45.816763 2569 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-bundle" (OuterVolumeSpecName: "bundle") pod "bf40407f-97a0-4f4b-b703-ecc8a6f916de" (UID: "bf40407f-97a0-4f4b-b703-ecc8a6f916de"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:57:45.817851 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:45.817826 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf40407f-97a0-4f4b-b703-ecc8a6f916de-kube-api-access-w8vfs" (OuterVolumeSpecName: "kube-api-access-w8vfs") pod "bf40407f-97a0-4f4b-b703-ecc8a6f916de" (UID: "bf40407f-97a0-4f4b-b703-ecc8a6f916de"). InnerVolumeSpecName "kube-api-access-w8vfs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:57:45.821543 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:45.821513 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-util" (OuterVolumeSpecName: "util") pod "bf40407f-97a0-4f4b-b703-ecc8a6f916de" (UID: "bf40407f-97a0-4f4b-b703-ecc8a6f916de"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:57:45.916891 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:45.916805 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-util\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\"" Apr 20 07:57:45.916891 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:45.916838 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8vfs\" (UniqueName: \"kubernetes.io/projected/bf40407f-97a0-4f4b-b703-ecc8a6f916de-kube-api-access-w8vfs\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\"" Apr 20 07:57:45.916891 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:45.916849 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf40407f-97a0-4f4b-b703-ecc8a6f916de-bundle\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\"" Apr 20 07:57:46.667112 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:46.667084 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" Apr 20 07:57:46.667380 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:46.667079 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wdctd" event={"ID":"bf40407f-97a0-4f4b-b703-ecc8a6f916de","Type":"ContainerDied","Data":"11ffd60457ac9a8e89f074e8368f482e47c20bc8d0f657cf8c2d666dafe0f42b"} Apr 20 07:57:46.667380 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:46.667189 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11ffd60457ac9a8e89f074e8368f482e47c20bc8d0f657cf8c2d666dafe0f42b" Apr 20 07:57:51.495980 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.495943 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968"] Apr 20 07:57:51.496377 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.496222 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf40407f-97a0-4f4b-b703-ecc8a6f916de" containerName="util" Apr 20 07:57:51.496377 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.496235 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40407f-97a0-4f4b-b703-ecc8a6f916de" containerName="util" Apr 20 07:57:51.496377 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.496248 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf40407f-97a0-4f4b-b703-ecc8a6f916de" containerName="pull" Apr 20 07:57:51.496377 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.496253 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40407f-97a0-4f4b-b703-ecc8a6f916de" containerName="pull" Apr 20 07:57:51.496377 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.496269 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bf40407f-97a0-4f4b-b703-ecc8a6f916de" containerName="extract" Apr 20 07:57:51.496377 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.496274 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40407f-97a0-4f4b-b703-ecc8a6f916de" containerName="extract" Apr 20 07:57:51.496377 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.496318 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf40407f-97a0-4f4b-b703-ecc8a6f916de" containerName="extract" Apr 20 07:57:51.499255 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.499236 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 07:57:51.501351 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.501329 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 07:57:51.501751 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.501715 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bht96\"" Apr 20 07:57:51.501751 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.501740 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 07:57:51.505250 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.505227 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968"] Apr 20 07:57:51.563161 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.563131 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 07:57:51.563350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.563193 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 07:57:51.563350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.563253 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t64l4\" (UniqueName: \"kubernetes.io/projected/91fccc6c-d0e9-4119-ac50-85572363aa9f-kube-api-access-t64l4\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 07:57:51.664465 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.664429 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 07:57:51.664465 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.664470 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t64l4\" (UniqueName: \"kubernetes.io/projected/91fccc6c-d0e9-4119-ac50-85572363aa9f-kube-api-access-t64l4\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 07:57:51.664674 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.664522 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 07:57:51.664901 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.664879 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 07:57:51.664939 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.664890 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 07:57:51.680605 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.680571 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t64l4\" (UniqueName: \"kubernetes.io/projected/91fccc6c-d0e9-4119-ac50-85572363aa9f-kube-api-access-t64l4\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 
07:57:51.809258 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.809224 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" Apr 20 07:57:51.930388 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:51.930352 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968"] Apr 20 07:57:51.933276 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:57:51.933237 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91fccc6c_d0e9_4119_ac50_85572363aa9f.slice/crio-c266f6b9dda7ff6384e16326d57f3be009b97151312efe764b610b8dabfa20d8 WatchSource:0}: Error finding container c266f6b9dda7ff6384e16326d57f3be009b97151312efe764b610b8dabfa20d8: Status 404 returned error can't find the container with id c266f6b9dda7ff6384e16326d57f3be009b97151312efe764b610b8dabfa20d8 Apr 20 07:57:52.686433 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:52.686402 2569 generic.go:358] "Generic (PLEG): container finished" podID="91fccc6c-d0e9-4119-ac50-85572363aa9f" containerID="70e14d5814cb0aa72723a99dc6fdaa00f72dfbff07c1dd6dec37fd53488ff783" exitCode=0 Apr 20 07:57:52.686772 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:52.686467 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" event={"ID":"91fccc6c-d0e9-4119-ac50-85572363aa9f","Type":"ContainerDied","Data":"70e14d5814cb0aa72723a99dc6fdaa00f72dfbff07c1dd6dec37fd53488ff783"} Apr 20 07:57:52.686772 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:52.686492 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" 
event={"ID":"91fccc6c-d0e9-4119-ac50-85572363aa9f","Type":"ContainerStarted","Data":"c266f6b9dda7ff6384e16326d57f3be009b97151312efe764b610b8dabfa20d8"} Apr 20 07:57:53.524683 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.524652 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb"] Apr 20 07:57:53.527688 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.527668 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" Apr 20 07:57:53.529681 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.529657 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 07:57:53.529797 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.529717 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 07:57:53.529797 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.529717 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 07:57:53.529797 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.529768 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 07:57:53.529797 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.529783 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9x2zw\"" Apr 20 07:57:53.541317 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.541296 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb"] Apr 20 07:57:53.580660 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.580627 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/347d9aac-7ac6-44f2-b44e-dd9b37d26eab-webhook-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-q6blb\" (UID: \"347d9aac-7ac6-44f2-b44e-dd9b37d26eab\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" Apr 20 07:57:53.580660 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.580661 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45j5c\" (UniqueName: \"kubernetes.io/projected/347d9aac-7ac6-44f2-b44e-dd9b37d26eab-kube-api-access-45j5c\") pod \"opendatahub-operator-controller-manager-687c889b9-q6blb\" (UID: \"347d9aac-7ac6-44f2-b44e-dd9b37d26eab\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" Apr 20 07:57:53.580850 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.580703 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/347d9aac-7ac6-44f2-b44e-dd9b37d26eab-apiservice-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-q6blb\" (UID: \"347d9aac-7ac6-44f2-b44e-dd9b37d26eab\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" Apr 20 07:57:53.681759 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.681721 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/347d9aac-7ac6-44f2-b44e-dd9b37d26eab-webhook-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-q6blb\" (UID: \"347d9aac-7ac6-44f2-b44e-dd9b37d26eab\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" Apr 20 07:57:53.681926 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.681768 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-45j5c\" (UniqueName: \"kubernetes.io/projected/347d9aac-7ac6-44f2-b44e-dd9b37d26eab-kube-api-access-45j5c\") pod \"opendatahub-operator-controller-manager-687c889b9-q6blb\" (UID: \"347d9aac-7ac6-44f2-b44e-dd9b37d26eab\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" Apr 20 07:57:53.681926 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.681840 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/347d9aac-7ac6-44f2-b44e-dd9b37d26eab-apiservice-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-q6blb\" (UID: \"347d9aac-7ac6-44f2-b44e-dd9b37d26eab\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" Apr 20 07:57:53.684341 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.684307 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/347d9aac-7ac6-44f2-b44e-dd9b37d26eab-webhook-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-q6blb\" (UID: \"347d9aac-7ac6-44f2-b44e-dd9b37d26eab\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" Apr 20 07:57:53.684448 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.684379 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/347d9aac-7ac6-44f2-b44e-dd9b37d26eab-apiservice-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-q6blb\" (UID: \"347d9aac-7ac6-44f2-b44e-dd9b37d26eab\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" Apr 20 07:57:53.691081 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.689652 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45j5c\" (UniqueName: \"kubernetes.io/projected/347d9aac-7ac6-44f2-b44e-dd9b37d26eab-kube-api-access-45j5c\") pod 
\"opendatahub-operator-controller-manager-687c889b9-q6blb\" (UID: \"347d9aac-7ac6-44f2-b44e-dd9b37d26eab\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb"
Apr 20 07:57:53.694808 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.694777 2569 generic.go:358] "Generic (PLEG): container finished" podID="91fccc6c-d0e9-4119-ac50-85572363aa9f" containerID="0e648442a79db55a154f81f24a614b2a23b7ffbcc77fbd5439e4c222e2555522" exitCode=0
Apr 20 07:57:53.694924 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.694808 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" event={"ID":"91fccc6c-d0e9-4119-ac50-85572363aa9f","Type":"ContainerDied","Data":"0e648442a79db55a154f81f24a614b2a23b7ffbcc77fbd5439e4c222e2555522"}
Apr 20 07:57:53.838862 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.838834 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb"
Apr 20 07:57:53.958388 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:53.958355 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb"]
Apr 20 07:57:53.961350 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:57:53.961320 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod347d9aac_7ac6_44f2_b44e_dd9b37d26eab.slice/crio-64fa4c83229bdbd8c3736a4b83be8ed9d42e53fc6156e2230ce2125f8da566c9 WatchSource:0}: Error finding container 64fa4c83229bdbd8c3736a4b83be8ed9d42e53fc6156e2230ce2125f8da566c9: Status 404 returned error can't find the container with id 64fa4c83229bdbd8c3736a4b83be8ed9d42e53fc6156e2230ce2125f8da566c9
Apr 20 07:57:54.702074 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:54.702036 2569 generic.go:358] "Generic (PLEG): container finished" podID="91fccc6c-d0e9-4119-ac50-85572363aa9f" containerID="0e145a4eaebffa1fa986f3a4dc3daf237fdd56461b1d599e682d8bf08dd5257e" exitCode=0
Apr 20 07:57:54.702538 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:54.702121 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" event={"ID":"91fccc6c-d0e9-4119-ac50-85572363aa9f","Type":"ContainerDied","Data":"0e145a4eaebffa1fa986f3a4dc3daf237fdd56461b1d599e682d8bf08dd5257e"}
Apr 20 07:57:54.703849 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:54.703810 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" event={"ID":"347d9aac-7ac6-44f2-b44e-dd9b37d26eab","Type":"ContainerStarted","Data":"64fa4c83229bdbd8c3736a4b83be8ed9d42e53fc6156e2230ce2125f8da566c9"}
Apr 20 07:57:56.318590 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.318570 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968"
Apr 20 07:57:56.403262 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.403235 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-bundle\") pod \"91fccc6c-d0e9-4119-ac50-85572363aa9f\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") "
Apr 20 07:57:56.403370 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.403302 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t64l4\" (UniqueName: \"kubernetes.io/projected/91fccc6c-d0e9-4119-ac50-85572363aa9f-kube-api-access-t64l4\") pod \"91fccc6c-d0e9-4119-ac50-85572363aa9f\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") "
Apr 20 07:57:56.403370 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.403333 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-util\") pod \"91fccc6c-d0e9-4119-ac50-85572363aa9f\" (UID: \"91fccc6c-d0e9-4119-ac50-85572363aa9f\") "
Apr 20 07:57:56.404070 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.404033 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-bundle" (OuterVolumeSpecName: "bundle") pod "91fccc6c-d0e9-4119-ac50-85572363aa9f" (UID: "91fccc6c-d0e9-4119-ac50-85572363aa9f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:57:56.405172 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.405148 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91fccc6c-d0e9-4119-ac50-85572363aa9f-kube-api-access-t64l4" (OuterVolumeSpecName: "kube-api-access-t64l4") pod "91fccc6c-d0e9-4119-ac50-85572363aa9f" (UID: "91fccc6c-d0e9-4119-ac50-85572363aa9f"). InnerVolumeSpecName "kube-api-access-t64l4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:57:56.412231 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.412193 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-util" (OuterVolumeSpecName: "util") pod "91fccc6c-d0e9-4119-ac50-85572363aa9f" (UID: "91fccc6c-d0e9-4119-ac50-85572363aa9f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:57:56.504571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.504536 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t64l4\" (UniqueName: \"kubernetes.io/projected/91fccc6c-d0e9-4119-ac50-85572363aa9f-kube-api-access-t64l4\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:57:56.504571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.504564 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-util\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:57:56.504571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.504574 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91fccc6c-d0e9-4119-ac50-85572363aa9f-bundle\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:57:56.712301 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.712194 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968" event={"ID":"91fccc6c-d0e9-4119-ac50-85572363aa9f","Type":"ContainerDied","Data":"c266f6b9dda7ff6384e16326d57f3be009b97151312efe764b610b8dabfa20d8"}
Apr 20 07:57:56.712301 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.712238 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt968"
Apr 20 07:57:56.712301 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.712256 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c266f6b9dda7ff6384e16326d57f3be009b97151312efe764b610b8dabfa20d8"
Apr 20 07:57:56.713715 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.713687 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" event={"ID":"347d9aac-7ac6-44f2-b44e-dd9b37d26eab","Type":"ContainerStarted","Data":"9c6a0182cd41324a14d860d08d5d5170fc07c312667d58af01d7a20db686e246"}
Apr 20 07:57:56.713847 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.713827 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb"
Apr 20 07:57:56.734344 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:57:56.734296 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb" podStartSLOduration=1.34647818 podStartE2EDuration="3.734285483s" podCreationTimestamp="2026-04-20 07:57:53 +0000 UTC" firstStartedPulling="2026-04-20 07:57:53.963355635 +0000 UTC m=+467.259928597" lastFinishedPulling="2026-04-20 07:57:56.351162934 +0000 UTC m=+469.647735900" observedRunningTime="2026-04-20 07:57:56.732539124 +0000 UTC m=+470.029112109" watchObservedRunningTime="2026-04-20 07:57:56.734285483 +0000 UTC m=+470.030858467"
Apr 20 07:58:07.719759 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:07.719728 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-q6blb"
Apr 20 07:58:10.276146 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.276110 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"]
Apr 20 07:58:10.276547 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.276383 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91fccc6c-d0e9-4119-ac50-85572363aa9f" containerName="util"
Apr 20 07:58:10.276547 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.276393 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fccc6c-d0e9-4119-ac50-85572363aa9f" containerName="util"
Apr 20 07:58:10.276547 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.276403 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91fccc6c-d0e9-4119-ac50-85572363aa9f" containerName="extract"
Apr 20 07:58:10.276547 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.276409 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fccc6c-d0e9-4119-ac50-85572363aa9f" containerName="extract"
Apr 20 07:58:10.276547 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.276415 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91fccc6c-d0e9-4119-ac50-85572363aa9f" containerName="pull"
Apr 20 07:58:10.276547 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.276421 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fccc6c-d0e9-4119-ac50-85572363aa9f" containerName="pull"
Apr 20 07:58:10.276547 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.276464 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="91fccc6c-d0e9-4119-ac50-85572363aa9f" containerName="extract"
Apr 20 07:58:10.283084 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.283062 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.285454 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.285430 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 07:58:10.285803 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.285787 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bht96\""
Apr 20 07:58:10.286574 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.286554 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 07:58:10.292622 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.292602 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"]
Apr 20 07:58:10.413654 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.413619 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.413820 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.413679 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.413820 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.413735 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2w8b\" (UniqueName: \"kubernetes.io/projected/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-kube-api-access-w2w8b\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.514668 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.514635 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.514810 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.514682 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2w8b\" (UniqueName: \"kubernetes.io/projected/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-kube-api-access-w2w8b\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.514810 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.514707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.515023 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.515009 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.515075 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.515056 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.521847 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.521824 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2w8b\" (UniqueName: \"kubernetes.io/projected/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-kube-api-access-w2w8b\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.593774 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.593680 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:10.717349 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.717322 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"]
Apr 20 07:58:10.720106 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:58:10.720079 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ee2eb2e_47ec_4f0d_8605_5e506cf9befe.slice/crio-ea13c4acf58cc8aa203bdcc28a0505050610a1b2d5fe254b34f5fda029615f65 WatchSource:0}: Error finding container ea13c4acf58cc8aa203bdcc28a0505050610a1b2d5fe254b34f5fda029615f65: Status 404 returned error can't find the container with id ea13c4acf58cc8aa203bdcc28a0505050610a1b2d5fe254b34f5fda029615f65
Apr 20 07:58:10.759621 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:10.759592 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr" event={"ID":"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe","Type":"ContainerStarted","Data":"ea13c4acf58cc8aa203bdcc28a0505050610a1b2d5fe254b34f5fda029615f65"}
Apr 20 07:58:11.430442 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.430404 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"]
Apr 20 07:58:11.433759 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.433737 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.435513 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.435486 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 07:58:11.435513 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.435502 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 07:58:11.435872 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.435853 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-k5w9z\""
Apr 20 07:58:11.436023 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.436005 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 07:58:11.436072 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.436028 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 07:58:11.442380 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.442360 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"]
Apr 20 07:58:11.523993 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.523959 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r46fb\" (UniqueName: \"kubernetes.io/projected/fb2fe040-660d-4160-92fd-18d18497d727-kube-api-access-r46fb\") pod \"kube-auth-proxy-7b9784c649-4nbqg\" (UID: \"fb2fe040-660d-4160-92fd-18d18497d727\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.523993 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.523996 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb2fe040-660d-4160-92fd-18d18497d727-tmp\") pod \"kube-auth-proxy-7b9784c649-4nbqg\" (UID: \"fb2fe040-660d-4160-92fd-18d18497d727\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.524237 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.524086 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2fe040-660d-4160-92fd-18d18497d727-tls-certs\") pod \"kube-auth-proxy-7b9784c649-4nbqg\" (UID: \"fb2fe040-660d-4160-92fd-18d18497d727\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.625493 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.625456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r46fb\" (UniqueName: \"kubernetes.io/projected/fb2fe040-660d-4160-92fd-18d18497d727-kube-api-access-r46fb\") pod \"kube-auth-proxy-7b9784c649-4nbqg\" (UID: \"fb2fe040-660d-4160-92fd-18d18497d727\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.625493 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.625493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb2fe040-660d-4160-92fd-18d18497d727-tmp\") pod \"kube-auth-proxy-7b9784c649-4nbqg\" (UID: \"fb2fe040-660d-4160-92fd-18d18497d727\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.625689 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.625520 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2fe040-660d-4160-92fd-18d18497d727-tls-certs\") pod \"kube-auth-proxy-7b9784c649-4nbqg\" (UID: \"fb2fe040-660d-4160-92fd-18d18497d727\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.627740 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.627721 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb2fe040-660d-4160-92fd-18d18497d727-tmp\") pod \"kube-auth-proxy-7b9784c649-4nbqg\" (UID: \"fb2fe040-660d-4160-92fd-18d18497d727\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.627961 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.627941 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2fe040-660d-4160-92fd-18d18497d727-tls-certs\") pod \"kube-auth-proxy-7b9784c649-4nbqg\" (UID: \"fb2fe040-660d-4160-92fd-18d18497d727\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.632547 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.632524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r46fb\" (UniqueName: \"kubernetes.io/projected/fb2fe040-660d-4160-92fd-18d18497d727-kube-api-access-r46fb\") pod \"kube-auth-proxy-7b9784c649-4nbqg\" (UID: \"fb2fe040-660d-4160-92fd-18d18497d727\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.744684 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.744614 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"
Apr 20 07:58:11.764601 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.764566 2569 generic.go:358] "Generic (PLEG): container finished" podID="6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" containerID="35e441b465f0d03b0efd2ecd7c2820bd6b29a00c301331c0af13bd83ec580561" exitCode=0
Apr 20 07:58:11.764719 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.764622 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr" event={"ID":"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe","Type":"ContainerDied","Data":"35e441b465f0d03b0efd2ecd7c2820bd6b29a00c301331c0af13bd83ec580561"}
Apr 20 07:58:11.858074 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:11.857993 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg"]
Apr 20 07:58:11.860566 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:58:11.860540 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2fe040_660d_4160_92fd_18d18497d727.slice/crio-0985336c0b5dbadd16979bed8100186ad8c123ca0a95e55f4934c2698cc08d1e WatchSource:0}: Error finding container 0985336c0b5dbadd16979bed8100186ad8c123ca0a95e55f4934c2698cc08d1e: Status 404 returned error can't find the container with id 0985336c0b5dbadd16979bed8100186ad8c123ca0a95e55f4934c2698cc08d1e
Apr 20 07:58:12.774777 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:12.774737 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg" event={"ID":"fb2fe040-660d-4160-92fd-18d18497d727","Type":"ContainerStarted","Data":"0985336c0b5dbadd16979bed8100186ad8c123ca0a95e55f4934c2698cc08d1e"}
Apr 20 07:58:12.777399 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:12.777371 2569 generic.go:358] "Generic (PLEG): container finished" podID="6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" containerID="242932c47eb0c01d2220be813f38c54c0a5f7aa0c1414430b61b0ee56182e6b2" exitCode=0
Apr 20 07:58:12.777535 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:12.777446 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr" event={"ID":"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe","Type":"ContainerDied","Data":"242932c47eb0c01d2220be813f38c54c0a5f7aa0c1414430b61b0ee56182e6b2"}
Apr 20 07:58:13.783998 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:13.783891 2569 generic.go:358] "Generic (PLEG): container finished" podID="6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" containerID="964de27c012dd1f8c60583d0733bc975d9f12ea0eead77eae7abdc317025bacc" exitCode=0
Apr 20 07:58:13.783998 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:13.783928 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr" event={"ID":"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe","Type":"ContainerDied","Data":"964de27c012dd1f8c60583d0733bc975d9f12ea0eead77eae7abdc317025bacc"}
Apr 20 07:58:14.366333 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.366294 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-8zsdw"]
Apr 20 07:58:14.369768 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.369746 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:14.371618 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.371564 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 20 07:58:14.371733 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.371715 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-cg955\""
Apr 20 07:58:14.376822 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.376782 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-8zsdw"]
Apr 20 07:58:14.447979 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.447944 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caddfbde-b868-43f4-bba6-1b6166d52d42-cert\") pod \"odh-model-controller-858dbf95b8-8zsdw\" (UID: \"caddfbde-b868-43f4-bba6-1b6166d52d42\") " pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:14.448168 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.448005 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnnz\" (UniqueName: \"kubernetes.io/projected/caddfbde-b868-43f4-bba6-1b6166d52d42-kube-api-access-qjnnz\") pod \"odh-model-controller-858dbf95b8-8zsdw\" (UID: \"caddfbde-b868-43f4-bba6-1b6166d52d42\") " pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:14.548678 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.548644 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caddfbde-b868-43f4-bba6-1b6166d52d42-cert\") pod \"odh-model-controller-858dbf95b8-8zsdw\" (UID: \"caddfbde-b868-43f4-bba6-1b6166d52d42\") " pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:14.548863 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.548701 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjnnz\" (UniqueName: \"kubernetes.io/projected/caddfbde-b868-43f4-bba6-1b6166d52d42-kube-api-access-qjnnz\") pod \"odh-model-controller-858dbf95b8-8zsdw\" (UID: \"caddfbde-b868-43f4-bba6-1b6166d52d42\") " pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:14.548863 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:58:14.548816 2569 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 07:58:14.548964 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:58:14.548894 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caddfbde-b868-43f4-bba6-1b6166d52d42-cert podName:caddfbde-b868-43f4-bba6-1b6166d52d42 nodeName:}" failed. No retries permitted until 2026-04-20 07:58:15.04887421 +0000 UTC m=+488.345447177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caddfbde-b868-43f4-bba6-1b6166d52d42-cert") pod "odh-model-controller-858dbf95b8-8zsdw" (UID: "caddfbde-b868-43f4-bba6-1b6166d52d42") : secret "odh-model-controller-webhook-cert" not found
Apr 20 07:58:14.556785 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.556758 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnnz\" (UniqueName: \"kubernetes.io/projected/caddfbde-b868-43f4-bba6-1b6166d52d42-kube-api-access-qjnnz\") pod \"odh-model-controller-858dbf95b8-8zsdw\" (UID: \"caddfbde-b868-43f4-bba6-1b6166d52d42\") " pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:14.982119 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:14.982093 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:15.053008 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.052984 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-bundle\") pod \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") "
Apr 20 07:58:15.053104 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.053020 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-util\") pod \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") "
Apr 20 07:58:15.053104 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.053052 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2w8b\" (UniqueName: \"kubernetes.io/projected/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-kube-api-access-w2w8b\") pod \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\" (UID: \"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe\") "
Apr 20 07:58:15.053224 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.053177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caddfbde-b868-43f4-bba6-1b6166d52d42-cert\") pod \"odh-model-controller-858dbf95b8-8zsdw\" (UID: \"caddfbde-b868-43f4-bba6-1b6166d52d42\") " pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:15.053336 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:58:15.053314 2569 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 07:58:15.053399 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:58:15.053364 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caddfbde-b868-43f4-bba6-1b6166d52d42-cert podName:caddfbde-b868-43f4-bba6-1b6166d52d42 nodeName:}" failed. No retries permitted until 2026-04-20 07:58:16.053350666 +0000 UTC m=+489.349923629 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/caddfbde-b868-43f4-bba6-1b6166d52d42-cert") pod "odh-model-controller-858dbf95b8-8zsdw" (UID: "caddfbde-b868-43f4-bba6-1b6166d52d42") : secret "odh-model-controller-webhook-cert" not found
Apr 20 07:58:15.054045 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.054016 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-bundle" (OuterVolumeSpecName: "bundle") pod "6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" (UID: "6ee2eb2e-47ec-4f0d-8605-5e506cf9befe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:58:15.055011 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.054984 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-kube-api-access-w2w8b" (OuterVolumeSpecName: "kube-api-access-w2w8b") pod "6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" (UID: "6ee2eb2e-47ec-4f0d-8605-5e506cf9befe"). InnerVolumeSpecName "kube-api-access-w2w8b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:58:15.058367 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.058345 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-util" (OuterVolumeSpecName: "util") pod "6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" (UID: "6ee2eb2e-47ec-4f0d-8605-5e506cf9befe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:58:15.154432 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.154391 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2w8b\" (UniqueName: \"kubernetes.io/projected/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-kube-api-access-w2w8b\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:58:15.154432 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.154426 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-bundle\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:58:15.154432 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.154438 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ee2eb2e-47ec-4f0d-8605-5e506cf9befe-util\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:58:15.793413 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.793373 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg" event={"ID":"fb2fe040-660d-4160-92fd-18d18497d727","Type":"ContainerStarted","Data":"ddda7036e93f48101c79280017214666381ee9e713e3739b055c3f2866e6fd15"}
Apr 20 07:58:15.795133 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.795107 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr" event={"ID":"6ee2eb2e-47ec-4f0d-8605-5e506cf9befe","Type":"ContainerDied","Data":"ea13c4acf58cc8aa203bdcc28a0505050610a1b2d5fe254b34f5fda029615f65"}
Apr 20 07:58:15.795133 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.795135 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea13c4acf58cc8aa203bdcc28a0505050610a1b2d5fe254b34f5fda029615f65"
Apr 20 07:58:15.795521 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.795140 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mczmr"
Apr 20 07:58:15.810005 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:15.809964 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7b9784c649-4nbqg" podStartSLOduration=1.650526956 podStartE2EDuration="4.809949296s" podCreationTimestamp="2026-04-20 07:58:11 +0000 UTC" firstStartedPulling="2026-04-20 07:58:11.862361731 +0000 UTC m=+485.158934694" lastFinishedPulling="2026-04-20 07:58:15.021784072 +0000 UTC m=+488.318357034" observedRunningTime="2026-04-20 07:58:15.808226738 +0000 UTC m=+489.104799715" watchObservedRunningTime="2026-04-20 07:58:15.809949296 +0000 UTC m=+489.106522280"
Apr 20 07:58:16.062667 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:16.062571 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caddfbde-b868-43f4-bba6-1b6166d52d42-cert\") pod \"odh-model-controller-858dbf95b8-8zsdw\" (UID: \"caddfbde-b868-43f4-bba6-1b6166d52d42\") " pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:16.065037 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:16.065010 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/caddfbde-b868-43f4-bba6-1b6166d52d42-cert\") pod \"odh-model-controller-858dbf95b8-8zsdw\" (UID: \"caddfbde-b868-43f4-bba6-1b6166d52d42\") " pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:16.182352 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:16.182310 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:16.301264 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:16.301236 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-8zsdw"]
Apr 20 07:58:16.303412 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:58:16.303384 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaddfbde_b868_43f4_bba6_1b6166d52d42.slice/crio-4d6220b7064bade0a61d21f4f42ae5239e40995c0c473f16f90a0f0472cdd427 WatchSource:0}: Error finding container 4d6220b7064bade0a61d21f4f42ae5239e40995c0c473f16f90a0f0472cdd427: Status 404 returned error can't find the container with id 4d6220b7064bade0a61d21f4f42ae5239e40995c0c473f16f90a0f0472cdd427
Apr 20 07:58:16.801300 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:16.801249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw" event={"ID":"caddfbde-b868-43f4-bba6-1b6166d52d42","Type":"ContainerStarted","Data":"4d6220b7064bade0a61d21f4f42ae5239e40995c0c473f16f90a0f0472cdd427"}
Apr 20 07:58:18.809951 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:18.809920 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw" event={"ID":"caddfbde-b868-43f4-bba6-1b6166d52d42","Type":"ContainerStarted","Data":"e5a349479027686b2ce3bbb1fedc2febea94d16c0c1c96b1ee2b938d53f549d1"}
Apr 20 07:58:18.810374 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:18.810064 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw"
Apr 20 07:58:18.826283 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:18.826196 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw" podStartSLOduration=2.389830392
podStartE2EDuration="4.82617635s" podCreationTimestamp="2026-04-20 07:58:14 +0000 UTC" firstStartedPulling="2026-04-20 07:58:16.304588129 +0000 UTC m=+489.601161092" lastFinishedPulling="2026-04-20 07:58:18.740934087 +0000 UTC m=+492.037507050" observedRunningTime="2026-04-20 07:58:18.825299346 +0000 UTC m=+492.121872331" watchObservedRunningTime="2026-04-20 07:58:18.82617635 +0000 UTC m=+492.122749335" Apr 20 07:58:19.814975 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.814938 2569 generic.go:358] "Generic (PLEG): container finished" podID="caddfbde-b868-43f4-bba6-1b6166d52d42" containerID="e5a349479027686b2ce3bbb1fedc2febea94d16c0c1c96b1ee2b938d53f549d1" exitCode=1 Apr 20 07:58:19.815408 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.814984 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw" event={"ID":"caddfbde-b868-43f4-bba6-1b6166d52d42","Type":"ContainerDied","Data":"e5a349479027686b2ce3bbb1fedc2febea94d16c0c1c96b1ee2b938d53f549d1"} Apr 20 07:58:19.815408 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.815326 2569 scope.go:117] "RemoveContainer" containerID="e5a349479027686b2ce3bbb1fedc2febea94d16c0c1c96b1ee2b938d53f549d1" Apr 20 07:58:19.823905 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.823882 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-gnct7"] Apr 20 07:58:19.824156 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.824144 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" containerName="extract" Apr 20 07:58:19.824224 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.824158 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" containerName="extract" Apr 20 07:58:19.824224 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.824168 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" containerName="pull" Apr 20 07:58:19.824224 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.824174 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" containerName="pull" Apr 20 07:58:19.824224 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.824193 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" containerName="util" Apr 20 07:58:19.824224 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.824199 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" containerName="util" Apr 20 07:58:19.824396 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.824255 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ee2eb2e-47ec-4f0d-8605-5e506cf9befe" containerName="extract" Apr 20 07:58:19.826711 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.826694 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:19.828438 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.828416 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 07:58:19.828540 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.828422 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-dd7mw\"" Apr 20 07:58:19.834161 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.834136 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-gnct7"] Apr 20 07:58:19.891985 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.891954 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-cert\") pod \"kserve-controller-manager-856948b99f-gnct7\" (UID: \"efe4b4d1-5b5f-4188-8c66-d364d4c15d89\") " pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:19.892171 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.892014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dckwz\" (UniqueName: \"kubernetes.io/projected/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-kube-api-access-dckwz\") pod \"kserve-controller-manager-856948b99f-gnct7\" (UID: \"efe4b4d1-5b5f-4188-8c66-d364d4c15d89\") " pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:19.993055 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.993025 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-cert\") pod \"kserve-controller-manager-856948b99f-gnct7\" (UID: \"efe4b4d1-5b5f-4188-8c66-d364d4c15d89\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:19.993055 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:19.993059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dckwz\" (UniqueName: \"kubernetes.io/projected/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-kube-api-access-dckwz\") pod \"kserve-controller-manager-856948b99f-gnct7\" (UID: \"efe4b4d1-5b5f-4188-8c66-d364d4c15d89\") " pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:19.993310 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:58:19.993187 2569 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 07:58:19.993310 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:58:19.993286 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-cert podName:efe4b4d1-5b5f-4188-8c66-d364d4c15d89 nodeName:}" failed. No retries permitted until 2026-04-20 07:58:20.493263049 +0000 UTC m=+493.789836031 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-cert") pod "kserve-controller-manager-856948b99f-gnct7" (UID: "efe4b4d1-5b5f-4188-8c66-d364d4c15d89") : secret "kserve-webhook-server-cert" not found Apr 20 07:58:20.002202 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:20.002178 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dckwz\" (UniqueName: \"kubernetes.io/projected/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-kube-api-access-dckwz\") pod \"kserve-controller-manager-856948b99f-gnct7\" (UID: \"efe4b4d1-5b5f-4188-8c66-d364d4c15d89\") " pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:20.498375 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:20.498277 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-cert\") pod \"kserve-controller-manager-856948b99f-gnct7\" (UID: \"efe4b4d1-5b5f-4188-8c66-d364d4c15d89\") " pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:20.498525 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:58:20.498409 2569 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 07:58:20.498525 ip-10-0-138-4 kubenswrapper[2569]: E0420 07:58:20.498466 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-cert podName:efe4b4d1-5b5f-4188-8c66-d364d4c15d89 nodeName:}" failed. No retries permitted until 2026-04-20 07:58:21.498450974 +0000 UTC m=+494.795023936 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-cert") pod "kserve-controller-manager-856948b99f-gnct7" (UID: "efe4b4d1-5b5f-4188-8c66-d364d4c15d89") : secret "kserve-webhook-server-cert" not found Apr 20 07:58:20.819601 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:20.819562 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw" event={"ID":"caddfbde-b868-43f4-bba6-1b6166d52d42","Type":"ContainerStarted","Data":"20b1b365c536949f74ece20d77ce7ae6f0c70da2e41a2579880db6d538cd91e2"} Apr 20 07:58:20.819970 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:20.819673 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw" Apr 20 07:58:21.507298 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:21.507261 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-cert\") pod \"kserve-controller-manager-856948b99f-gnct7\" (UID: \"efe4b4d1-5b5f-4188-8c66-d364d4c15d89\") " pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:21.509608 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:21.509585 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efe4b4d1-5b5f-4188-8c66-d364d4c15d89-cert\") pod \"kserve-controller-manager-856948b99f-gnct7\" (UID: \"efe4b4d1-5b5f-4188-8c66-d364d4c15d89\") " pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:21.638350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:21.638304 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:21.756266 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:21.756240 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-gnct7"] Apr 20 07:58:21.758393 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:58:21.758331 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefe4b4d1_5b5f_4188_8c66_d364d4c15d89.slice/crio-941ed57252efa292ae0ec89da11aaff8d849adaecf7ede192753c64e2c192216 WatchSource:0}: Error finding container 941ed57252efa292ae0ec89da11aaff8d849adaecf7ede192753c64e2c192216: Status 404 returned error can't find the container with id 941ed57252efa292ae0ec89da11aaff8d849adaecf7ede192753c64e2c192216 Apr 20 07:58:21.824243 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:21.824199 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" event={"ID":"efe4b4d1-5b5f-4188-8c66-d364d4c15d89","Type":"ContainerStarted","Data":"941ed57252efa292ae0ec89da11aaff8d849adaecf7ede192753c64e2c192216"} Apr 20 07:58:24.335793 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.335751 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f"] Apr 20 07:58:24.339421 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.339398 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.342170 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.342149 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 07:58:24.342652 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.342635 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bht96\"" Apr 20 07:58:24.342732 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.342641 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 07:58:24.363328 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.363302 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f"] Apr 20 07:58:24.433121 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.433091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.433315 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.433164 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.433315 ip-10-0-138-4 
kubenswrapper[2569]: I0420 07:58:24.433233 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7hw9\" (UniqueName: \"kubernetes.io/projected/788b1ffd-7514-4094-9925-b5d584a33362-kube-api-access-c7hw9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.534592 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.534557 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.534764 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.534599 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7hw9\" (UniqueName: \"kubernetes.io/projected/788b1ffd-7514-4094-9925-b5d584a33362-kube-api-access-c7hw9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.534764 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.534635 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.535030 ip-10-0-138-4 kubenswrapper[2569]: I0420 
07:58:24.535005 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.535106 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.535054 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.542627 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.542603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7hw9\" (UniqueName: \"kubernetes.io/projected/788b1ffd-7514-4094-9925-b5d584a33362-kube-api-access-c7hw9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.648036 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.647958 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:24.769746 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.769705 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f"] Apr 20 07:58:24.773934 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:58:24.773904 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod788b1ffd_7514_4094_9925_b5d584a33362.slice/crio-9a9cd1c5eb09f4bb0f3c98fcd398a3193a9a6e99fd8f768cb43b276d2a507d29 WatchSource:0}: Error finding container 9a9cd1c5eb09f4bb0f3c98fcd398a3193a9a6e99fd8f768cb43b276d2a507d29: Status 404 returned error can't find the container with id 9a9cd1c5eb09f4bb0f3c98fcd398a3193a9a6e99fd8f768cb43b276d2a507d29 Apr 20 07:58:24.834335 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.834245 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" event={"ID":"efe4b4d1-5b5f-4188-8c66-d364d4c15d89","Type":"ContainerStarted","Data":"ddb0d21b021665685932b9cf776b83a9edc948cde1d6b060d0a35f94885d6118"} Apr 20 07:58:24.834463 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.834445 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:58:24.835702 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.835683 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" event={"ID":"788b1ffd-7514-4094-9925-b5d584a33362","Type":"ContainerStarted","Data":"9a9cd1c5eb09f4bb0f3c98fcd398a3193a9a6e99fd8f768cb43b276d2a507d29"} Apr 20 07:58:24.852994 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:24.852945 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" podStartSLOduration=3.844672271 podStartE2EDuration="5.852932408s" podCreationTimestamp="2026-04-20 07:58:19 +0000 UTC" firstStartedPulling="2026-04-20 07:58:21.759772148 +0000 UTC m=+495.056345114" lastFinishedPulling="2026-04-20 07:58:23.768032274 +0000 UTC m=+497.064605251" observedRunningTime="2026-04-20 07:58:24.852039078 +0000 UTC m=+498.148612061" watchObservedRunningTime="2026-04-20 07:58:24.852932408 +0000 UTC m=+498.149505392" Apr 20 07:58:25.641442 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.641404 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq"] Apr 20 07:58:25.644293 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.644278 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" Apr 20 07:58:25.647610 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.647586 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-xmnss\"" Apr 20 07:58:25.648383 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.648363 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 07:58:25.649146 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.649126 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 07:58:25.657793 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.657769 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq"] Apr 20 07:58:25.745248 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.745193 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: 
\"kubernetes.io/downward-api/1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vw4pq\" (UID: \"1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" Apr 20 07:58:25.745411 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.745264 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6pw8\" (UniqueName: \"kubernetes.io/projected/1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0-kube-api-access-z6pw8\") pod \"servicemesh-operator3-55f49c5f94-vw4pq\" (UID: \"1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" Apr 20 07:58:25.840358 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.840325 2569 generic.go:358] "Generic (PLEG): container finished" podID="788b1ffd-7514-4094-9925-b5d584a33362" containerID="85de95faae9a4cc8d9bb0e952069a3c0755f1c7353ae8e7f60077c9064959695" exitCode=0 Apr 20 07:58:25.840519 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.840414 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" event={"ID":"788b1ffd-7514-4094-9925-b5d584a33362","Type":"ContainerDied","Data":"85de95faae9a4cc8d9bb0e952069a3c0755f1c7353ae8e7f60077c9064959695"} Apr 20 07:58:25.846267 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.846235 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vw4pq\" (UID: \"1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" Apr 20 07:58:25.846352 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.846289 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6pw8\" 
(UniqueName: \"kubernetes.io/projected/1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0-kube-api-access-z6pw8\") pod \"servicemesh-operator3-55f49c5f94-vw4pq\" (UID: \"1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" Apr 20 07:58:25.848688 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.848667 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vw4pq\" (UID: \"1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" Apr 20 07:58:25.855413 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.855391 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6pw8\" (UniqueName: \"kubernetes.io/projected/1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0-kube-api-access-z6pw8\") pod \"servicemesh-operator3-55f49c5f94-vw4pq\" (UID: \"1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" Apr 20 07:58:25.954077 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:25.954007 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" Apr 20 07:58:26.079321 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:26.079291 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq"] Apr 20 07:58:26.080389 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:58:26.080358 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b7fa17f_4f2f_4c6a_ae55_375b5dcdcde0.slice/crio-52e7a22ce08ce69dcd19b74f248c4d4cfb860e4cca4d4b9844667b1a012b4f0b WatchSource:0}: Error finding container 52e7a22ce08ce69dcd19b74f248c4d4cfb860e4cca4d4b9844667b1a012b4f0b: Status 404 returned error can't find the container with id 52e7a22ce08ce69dcd19b74f248c4d4cfb860e4cca4d4b9844667b1a012b4f0b Apr 20 07:58:26.845387 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:26.845351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" event={"ID":"1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0","Type":"ContainerStarted","Data":"52e7a22ce08ce69dcd19b74f248c4d4cfb860e4cca4d4b9844667b1a012b4f0b"} Apr 20 07:58:27.851154 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:27.851115 2569 generic.go:358] "Generic (PLEG): container finished" podID="788b1ffd-7514-4094-9925-b5d584a33362" containerID="e0a1dd274a153333c2ac488c059d37cb2db04b9cf79f0bffbc4acb641438add1" exitCode=0 Apr 20 07:58:27.851569 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:27.851187 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" event={"ID":"788b1ffd-7514-4094-9925-b5d584a33362","Type":"ContainerDied","Data":"e0a1dd274a153333c2ac488c059d37cb2db04b9cf79f0bffbc4acb641438add1"} Apr 20 07:58:28.856725 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:28.856688 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" event={"ID":"1b7fa17f-4f2f-4c6a-ae55-375b5dcdcde0","Type":"ContainerStarted","Data":"360ff6a47e93680646b54d8efd09edda52b27cb97250a4301fba75d138d47126"} Apr 20 07:58:28.857091 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:28.857068 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" Apr 20 07:58:28.859081 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:28.859058 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" event={"ID":"788b1ffd-7514-4094-9925-b5d584a33362","Type":"ContainerStarted","Data":"433763005d505ef0940489cba828bea3b6f3933278ab8ef0c209a8c83a3374b1"} Apr 20 07:58:28.874921 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:28.874875 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" podStartSLOduration=1.172115486 podStartE2EDuration="3.874858403s" podCreationTimestamp="2026-04-20 07:58:25 +0000 UTC" firstStartedPulling="2026-04-20 07:58:26.083115472 +0000 UTC m=+499.379688439" lastFinishedPulling="2026-04-20 07:58:28.785858391 +0000 UTC m=+502.082431356" observedRunningTime="2026-04-20 07:58:28.874223219 +0000 UTC m=+502.170796201" watchObservedRunningTime="2026-04-20 07:58:28.874858403 +0000 UTC m=+502.171431388" Apr 20 07:58:28.891372 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:28.890921 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" podStartSLOduration=3.758696102 podStartE2EDuration="4.890902297s" podCreationTimestamp="2026-04-20 07:58:24 +0000 UTC" firstStartedPulling="2026-04-20 07:58:25.841564669 +0000 UTC m=+499.138137631" lastFinishedPulling="2026-04-20 07:58:26.97377086 +0000 UTC m=+500.270343826" 
observedRunningTime="2026-04-20 07:58:28.888528117 +0000 UTC m=+502.185101103" watchObservedRunningTime="2026-04-20 07:58:28.890902297 +0000 UTC m=+502.187475284" Apr 20 07:58:29.864160 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:29.864123 2569 generic.go:358] "Generic (PLEG): container finished" podID="788b1ffd-7514-4094-9925-b5d584a33362" containerID="433763005d505ef0940489cba828bea3b6f3933278ab8ef0c209a8c83a3374b1" exitCode=0 Apr 20 07:58:29.864630 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:29.864230 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" event={"ID":"788b1ffd-7514-4094-9925-b5d584a33362","Type":"ContainerDied","Data":"433763005d505ef0940489cba828bea3b6f3933278ab8ef0c209a8c83a3374b1"} Apr 20 07:58:30.991389 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:30.991367 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:31.100835 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.100789 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7hw9\" (UniqueName: \"kubernetes.io/projected/788b1ffd-7514-4094-9925-b5d584a33362-kube-api-access-c7hw9\") pod \"788b1ffd-7514-4094-9925-b5d584a33362\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " Apr 20 07:58:31.101022 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.100998 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-util\") pod \"788b1ffd-7514-4094-9925-b5d584a33362\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " Apr 20 07:58:31.101071 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.101046 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-bundle\") pod \"788b1ffd-7514-4094-9925-b5d584a33362\" (UID: \"788b1ffd-7514-4094-9925-b5d584a33362\") " Apr 20 07:58:31.101974 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.101945 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-bundle" (OuterVolumeSpecName: "bundle") pod "788b1ffd-7514-4094-9925-b5d584a33362" (UID: "788b1ffd-7514-4094-9925-b5d584a33362"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:58:31.102930 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.102907 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/788b1ffd-7514-4094-9925-b5d584a33362-kube-api-access-c7hw9" (OuterVolumeSpecName: "kube-api-access-c7hw9") pod "788b1ffd-7514-4094-9925-b5d584a33362" (UID: "788b1ffd-7514-4094-9925-b5d584a33362"). InnerVolumeSpecName "kube-api-access-c7hw9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:58:31.105025 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.104985 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-util" (OuterVolumeSpecName: "util") pod "788b1ffd-7514-4094-9925-b5d584a33362" (UID: "788b1ffd-7514-4094-9925-b5d584a33362"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:58:31.202542 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.202447 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-util\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\"" Apr 20 07:58:31.202542 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.202489 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/788b1ffd-7514-4094-9925-b5d584a33362-bundle\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\"" Apr 20 07:58:31.202542 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.202499 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7hw9\" (UniqueName: \"kubernetes.io/projected/788b1ffd-7514-4094-9925-b5d584a33362-kube-api-access-c7hw9\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\"" Apr 20 07:58:31.851701 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.851673 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-8zsdw" Apr 20 07:58:31.872093 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.872067 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" Apr 20 07:58:31.872276 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.872096 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vjl4f" event={"ID":"788b1ffd-7514-4094-9925-b5d584a33362","Type":"ContainerDied","Data":"9a9cd1c5eb09f4bb0f3c98fcd398a3193a9a6e99fd8f768cb43b276d2a507d29"} Apr 20 07:58:31.872276 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:31.872131 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a9cd1c5eb09f4bb0f3c98fcd398a3193a9a6e99fd8f768cb43b276d2a507d29" Apr 20 07:58:39.866906 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:39.866873 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vw4pq" Apr 20 07:58:40.932910 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.932875 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb"] Apr 20 07:58:40.933286 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.933182 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="788b1ffd-7514-4094-9925-b5d584a33362" containerName="extract" Apr 20 07:58:40.933286 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.933194 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="788b1ffd-7514-4094-9925-b5d584a33362" containerName="extract" Apr 20 07:58:40.933286 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.933218 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="788b1ffd-7514-4094-9925-b5d584a33362" containerName="pull" Apr 20 07:58:40.933286 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.933224 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="788b1ffd-7514-4094-9925-b5d584a33362" containerName="pull" 
Apr 20 07:58:40.933286 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.933242 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="788b1ffd-7514-4094-9925-b5d584a33362" containerName="util" Apr 20 07:58:40.933286 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.933248 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="788b1ffd-7514-4094-9925-b5d584a33362" containerName="util" Apr 20 07:58:40.933474 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.933293 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="788b1ffd-7514-4094-9925-b5d584a33362" containerName="extract" Apr 20 07:58:40.937830 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.937809 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:40.939955 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.939932 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-szlp7\"" Apr 20 07:58:40.940152 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.940128 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 07:58:40.940269 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.940153 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 07:58:40.940269 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.940128 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 07:58:40.940382 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.940264 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 07:58:40.947926 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:40.947905 2569 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb"] Apr 20 07:58:41.080076 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.080042 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.080271 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.080081 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.080271 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.080145 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.080271 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.080165 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.080387 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.080262 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6r2m\" (UniqueName: \"kubernetes.io/projected/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-kube-api-access-l6r2m\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.080387 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.080298 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.080387 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.080338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.181570 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.181534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.181779 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.181703 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.181779 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.181759 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6r2m\" (UniqueName: \"kubernetes.io/projected/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-kube-api-access-l6r2m\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.181916 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.181789 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.181916 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.181842 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.181916 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.181895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.182048 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.181945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.182746 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.182690 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.184233 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.184148 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.184532 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.184502 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.184610 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.184581 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: 
\"kubernetes.io/secret/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.184680 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.184660 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.207134 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.207099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6r2m\" (UniqueName: \"kubernetes.io/projected/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-kube-api-access-l6r2m\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.207547 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.207523 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/24c37dbe-4cd8-4313-adfd-4d04f39bd0d6-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-xprjb\" (UID: \"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.249465 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.249434 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:41.399467 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.399439 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb"] Apr 20 07:58:41.400568 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:58:41.400543 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24c37dbe_4cd8_4313_adfd_4d04f39bd0d6.slice/crio-26b6cc7092b6a2778d4190d52eb4a97f37693503c986475cbd9b8b8abda03f8e WatchSource:0}: Error finding container 26b6cc7092b6a2778d4190d52eb4a97f37693503c986475cbd9b8b8abda03f8e: Status 404 returned error can't find the container with id 26b6cc7092b6a2778d4190d52eb4a97f37693503c986475cbd9b8b8abda03f8e Apr 20 07:58:41.910587 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:41.910545 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" event={"ID":"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6","Type":"ContainerStarted","Data":"26b6cc7092b6a2778d4190d52eb4a97f37693503c986475cbd9b8b8abda03f8e"} Apr 20 07:58:43.801510 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:43.801458 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 07:58:43.801745 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:43.801551 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 07:58:43.918723 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:43.918683 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" 
event={"ID":"24c37dbe-4cd8-4313-adfd-4d04f39bd0d6","Type":"ContainerStarted","Data":"89312d78640047bb2541d0b68faaf76eee7ab97fde445d844e7e4d2c77145af3"} Apr 20 07:58:43.918906 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:43.918757 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:43.936853 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:43.936800 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" podStartSLOduration=1.538444212 podStartE2EDuration="3.936784814s" podCreationTimestamp="2026-04-20 07:58:40 +0000 UTC" firstStartedPulling="2026-04-20 07:58:41.402906294 +0000 UTC m=+514.699479256" lastFinishedPulling="2026-04-20 07:58:43.801246894 +0000 UTC m=+517.097819858" observedRunningTime="2026-04-20 07:58:43.935266797 +0000 UTC m=+517.231839783" watchObservedRunningTime="2026-04-20 07:58:43.936784814 +0000 UTC m=+517.233357797" Apr 20 07:58:44.924327 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:44.924284 2569 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-xprjb container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 07:58:44.924782 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:44.924373 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" podUID="24c37dbe-4cd8-4313-adfd-4d04f39bd0d6" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 07:58:47.923381 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:47.923350 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xprjb" Apr 20 07:58:55.845778 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:58:55.845741 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-gnct7" Apr 20 07:59:26.610958 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.610922 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t"] Apr 20 07:59:26.614377 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.614361 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:26.616220 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.616190 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 07:59:26.616321 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.616203 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 07:59:26.616619 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.616601 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-7v55j\"" Apr 20 07:59:26.620744 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.620721 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t"] Apr 20 07:59:26.756600 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.756565 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:26.756600 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.756600 
2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:26.756803 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.756637 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr8fh\" (UniqueName: \"kubernetes.io/projected/dfb1b096-0c6b-4634-a22b-de3317e5ca49-kube-api-access-jr8fh\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:26.857578 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.857542 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:26.857578 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.857579 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:26.857834 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.857618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-jr8fh\" (UniqueName: \"kubernetes.io/projected/dfb1b096-0c6b-4634-a22b-de3317e5ca49-kube-api-access-jr8fh\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:26.857929 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.857911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:26.857968 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.857943 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:26.864560 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.864497 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr8fh\" (UniqueName: \"kubernetes.io/projected/dfb1b096-0c6b-4634-a22b-de3317e5ca49-kube-api-access-jr8fh\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:26.924250 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:26.924197 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:27.042343 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.042304 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t"] Apr 20 07:59:27.049263 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:59:27.049229 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfb1b096_0c6b_4634_a22b_de3317e5ca49.slice/crio-026fa2ca40b0d9fc3408e8369080d78d9da249b19ccbf842c1373f37a6443edd WatchSource:0}: Error finding container 026fa2ca40b0d9fc3408e8369080d78d9da249b19ccbf842c1373f37a6443edd: Status 404 returned error can't find the container with id 026fa2ca40b0d9fc3408e8369080d78d9da249b19ccbf842c1373f37a6443edd Apr 20 07:59:27.072433 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.072403 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" event={"ID":"dfb1b096-0c6b-4634-a22b-de3317e5ca49","Type":"ContainerStarted","Data":"026fa2ca40b0d9fc3408e8369080d78d9da249b19ccbf842c1373f37a6443edd"} Apr 20 07:59:27.211833 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.211793 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"] Apr 20 07:59:27.215392 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.215368 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc" Apr 20 07:59:27.221363 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.221337 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"] Apr 20 07:59:27.361355 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.361311 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc" Apr 20 07:59:27.361528 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.361373 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s8l2\" (UniqueName: \"kubernetes.io/projected/24a008ba-c82d-48df-9a39-aa2a17fe63f4-kube-api-access-5s8l2\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc" Apr 20 07:59:27.361528 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.361412 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc" Apr 20 07:59:27.462162 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.462074 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"
Apr 20 07:59:27.462162 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.462117 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s8l2\" (UniqueName: \"kubernetes.io/projected/24a008ba-c82d-48df-9a39-aa2a17fe63f4-kube-api-access-5s8l2\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"
Apr 20 07:59:27.462162 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.462137 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"
Apr 20 07:59:27.462496 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.462478 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"
Apr 20 07:59:27.462549 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.462530 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"
Apr 20 07:59:27.469547 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.469521 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s8l2\" (UniqueName: \"kubernetes.io/projected/24a008ba-c82d-48df-9a39-aa2a17fe63f4-kube-api-access-5s8l2\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"
Apr 20 07:59:27.526652 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.526621 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"
Apr 20 07:59:27.643230 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.642718 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"]
Apr 20 07:59:27.645397 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.645367 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"]
Apr 20 07:59:27.650106 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.650084 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:27.654955 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.654932 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"]
Apr 20 07:59:27.655457 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:59:27.655433 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24a008ba_c82d_48df_9a39_aa2a17fe63f4.slice/crio-8f573c59bcaaf54ffa3948b160742bea77fb694f1f6d1b67bbe7d38a18086498 WatchSource:0}: Error finding container 8f573c59bcaaf54ffa3948b160742bea77fb694f1f6d1b67bbe7d38a18086498: Status 404 returned error can't find the container with id 8f573c59bcaaf54ffa3948b160742bea77fb694f1f6d1b67bbe7d38a18086498
Apr 20 07:59:27.764475 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.764446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:27.764601 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.764487 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z648s\" (UniqueName: \"kubernetes.io/projected/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-kube-api-access-z648s\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:27.764639 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.764598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:27.865766 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.865727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:27.865946 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.865773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:27.865946 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.865886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z648s\" (UniqueName: \"kubernetes.io/projected/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-kube-api-access-z648s\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:27.866161 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.866132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:27.866243 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.866141 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:27.873676 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.873652 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z648s\" (UniqueName: \"kubernetes.io/projected/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-kube-api-access-z648s\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:27.962146 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:27.962113 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:28.020527 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.020496 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"]
Apr 20 07:59:28.025137 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.025113 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.030939 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.030910 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"]
Apr 20 07:59:28.077076 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.077042 2569 generic.go:358] "Generic (PLEG): container finished" podID="dfb1b096-0c6b-4634-a22b-de3317e5ca49" containerID="8d8d25e9ef75dce10b99ad2aade5a9e386651859b724de67370091c8da405d42" exitCode=0
Apr 20 07:59:28.077261 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.077135 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" event={"ID":"dfb1b096-0c6b-4634-a22b-de3317e5ca49","Type":"ContainerDied","Data":"8d8d25e9ef75dce10b99ad2aade5a9e386651859b724de67370091c8da405d42"}
Apr 20 07:59:28.078584 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.078564 2569 generic.go:358] "Generic (PLEG): container finished" podID="24a008ba-c82d-48df-9a39-aa2a17fe63f4" containerID="d3adcf3c5b2b1723b2a979749d958f8ce4621da674112a3935dcb1d4fe4a6b9a" exitCode=0
Apr 20 07:59:28.078689 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.078611 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc" event={"ID":"24a008ba-c82d-48df-9a39-aa2a17fe63f4","Type":"ContainerDied","Data":"d3adcf3c5b2b1723b2a979749d958f8ce4621da674112a3935dcb1d4fe4a6b9a"}
Apr 20 07:59:28.078689 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.078626 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc" event={"ID":"24a008ba-c82d-48df-9a39-aa2a17fe63f4","Type":"ContainerStarted","Data":"8f573c59bcaaf54ffa3948b160742bea77fb694f1f6d1b67bbe7d38a18086498"}
Apr 20 07:59:28.084498 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.084476 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"]
Apr 20 07:59:28.087514 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:59:28.087466 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03b4b3e_46c1_4df2_ad89_b51387e90d5f.slice/crio-c47713f868e78b4e9fafed584e55b9f5dccbde40f13b0765f153e0781be7edbd WatchSource:0}: Error finding container c47713f868e78b4e9fafed584e55b9f5dccbde40f13b0765f153e0781be7edbd: Status 404 returned error can't find the container with id c47713f868e78b4e9fafed584e55b9f5dccbde40f13b0765f153e0781be7edbd
Apr 20 07:59:28.168557 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.168508 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.168557 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.168567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.168825 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.168635 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9z5d\" (UniqueName: \"kubernetes.io/projected/a0f8ffbf-0872-4075-b182-a5ad51501cfd-kube-api-access-q9z5d\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.269089 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.269057 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9z5d\" (UniqueName: \"kubernetes.io/projected/a0f8ffbf-0872-4075-b182-a5ad51501cfd-kube-api-access-q9z5d\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.269294 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.269167 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.269294 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.269241 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.269563 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.269544 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.269618 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.269597 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.276906 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.276884 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9z5d\" (UniqueName: \"kubernetes.io/projected/a0f8ffbf-0872-4075-b182-a5ad51501cfd-kube-api-access-q9z5d\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.336786 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.336750 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:28.456412 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:28.456377 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"]
Apr 20 07:59:28.458047 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:59:28.458020 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f8ffbf_0872_4075_b182_a5ad51501cfd.slice/crio-0f4c7d1a2614ca03a906bc3098b743280ddbf980416b8577d0557399627cbbb5 WatchSource:0}: Error finding container 0f4c7d1a2614ca03a906bc3098b743280ddbf980416b8577d0557399627cbbb5: Status 404 returned error can't find the container with id 0f4c7d1a2614ca03a906bc3098b743280ddbf980416b8577d0557399627cbbb5
Apr 20 07:59:29.083854 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:29.083826 2569 generic.go:358] "Generic (PLEG): container finished" podID="24a008ba-c82d-48df-9a39-aa2a17fe63f4" containerID="7e8a95de71834436c3ee7c5423b06849b976af41fbcc043ab2cff33e481eb39d" exitCode=0
Apr 20 07:59:29.084201 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:29.083904 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc" event={"ID":"24a008ba-c82d-48df-9a39-aa2a17fe63f4","Type":"ContainerDied","Data":"7e8a95de71834436c3ee7c5423b06849b976af41fbcc043ab2cff33e481eb39d"}
Apr 20 07:59:29.085374 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:29.085354 2569 generic.go:358] "Generic (PLEG): container finished" podID="a0f8ffbf-0872-4075-b182-a5ad51501cfd" containerID="33bb69fe93f41fa75f3dbdb63eb7547d2255d8b66de874ad9953fd845c229684" exitCode=0
Apr 20 07:59:29.085488 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:29.085387 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm" event={"ID":"a0f8ffbf-0872-4075-b182-a5ad51501cfd","Type":"ContainerDied","Data":"33bb69fe93f41fa75f3dbdb63eb7547d2255d8b66de874ad9953fd845c229684"}
Apr 20 07:59:29.085488 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:29.085423 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm" event={"ID":"a0f8ffbf-0872-4075-b182-a5ad51501cfd","Type":"ContainerStarted","Data":"0f4c7d1a2614ca03a906bc3098b743280ddbf980416b8577d0557399627cbbb5"}
Apr 20 07:59:29.086987 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:29.086967 2569 generic.go:358] "Generic (PLEG): container finished" podID="e03b4b3e-46c1-4df2-ad89-b51387e90d5f" containerID="b536579602aa1becd81eb64587b34f7ff203e092e73d67feeab8d69b806322ee" exitCode=0
Apr 20 07:59:29.087083 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:29.087056 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz" event={"ID":"e03b4b3e-46c1-4df2-ad89-b51387e90d5f","Type":"ContainerDied","Data":"b536579602aa1becd81eb64587b34f7ff203e092e73d67feeab8d69b806322ee"}
Apr 20 07:59:29.087151 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:29.087083 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz" event={"ID":"e03b4b3e-46c1-4df2-ad89-b51387e90d5f","Type":"ContainerStarted","Data":"c47713f868e78b4e9fafed584e55b9f5dccbde40f13b0765f153e0781be7edbd"}
Apr 20 07:59:29.089060 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:29.089042 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" event={"ID":"dfb1b096-0c6b-4634-a22b-de3317e5ca49","Type":"ContainerStarted","Data":"42976496a0fe65a2dc39633c46f2aa3d6a22310a602342e3faf323bc8cc997a3"}
Apr 20 07:59:30.095105 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:30.095015 2569 generic.go:358] "Generic (PLEG): container finished" podID="a0f8ffbf-0872-4075-b182-a5ad51501cfd" containerID="674340db445fc3283e038d0196862a29b3e0f5c701d8a9ab3663677fa05fade0" exitCode=0
Apr 20 07:59:30.095105 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:30.095059 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm" event={"ID":"a0f8ffbf-0872-4075-b182-a5ad51501cfd","Type":"ContainerDied","Data":"674340db445fc3283e038d0196862a29b3e0f5c701d8a9ab3663677fa05fade0"}
Apr 20 07:59:30.096889 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:30.096815 2569 generic.go:358] "Generic (PLEG): container finished" podID="e03b4b3e-46c1-4df2-ad89-b51387e90d5f" containerID="d43e2e3d4af6cd5dc44f10942c78055b9623d321e8b0c8f4704b541793c3a127" exitCode=0
Apr 20 07:59:30.096957 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:30.096887 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz" event={"ID":"e03b4b3e-46c1-4df2-ad89-b51387e90d5f","Type":"ContainerDied","Data":"d43e2e3d4af6cd5dc44f10942c78055b9623d321e8b0c8f4704b541793c3a127"}
Apr 20 07:59:30.098654 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:30.098631 2569 generic.go:358] "Generic (PLEG): container finished" podID="dfb1b096-0c6b-4634-a22b-de3317e5ca49" containerID="42976496a0fe65a2dc39633c46f2aa3d6a22310a602342e3faf323bc8cc997a3" exitCode=0
Apr 20 07:59:30.098738 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:30.098712 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" event={"ID":"dfb1b096-0c6b-4634-a22b-de3317e5ca49","Type":"ContainerDied","Data":"42976496a0fe65a2dc39633c46f2aa3d6a22310a602342e3faf323bc8cc997a3"}
Apr 20 07:59:30.100592 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:30.100566 2569 generic.go:358] "Generic (PLEG): container finished" podID="24a008ba-c82d-48df-9a39-aa2a17fe63f4" containerID="29d5b341912cb4762850276c29cd035821b8fc598be30be7512b042f39b953ef" exitCode=0
Apr 20 07:59:30.100675 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:30.100600 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc" event={"ID":"24a008ba-c82d-48df-9a39-aa2a17fe63f4","Type":"ContainerDied","Data":"29d5b341912cb4762850276c29cd035821b8fc598be30be7512b042f39b953ef"}
Apr 20 07:59:31.105879 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.105841 2569 generic.go:358] "Generic (PLEG): container finished" podID="e03b4b3e-46c1-4df2-ad89-b51387e90d5f" containerID="9c90ccaaaceb1edbc3c01eae20637b510f0eb3d5bfff00dd081d9f7bb634b048" exitCode=0
Apr 20 07:59:31.106331 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.105925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz" event={"ID":"e03b4b3e-46c1-4df2-ad89-b51387e90d5f","Type":"ContainerDied","Data":"9c90ccaaaceb1edbc3c01eae20637b510f0eb3d5bfff00dd081d9f7bb634b048"}
Apr 20 07:59:31.107825 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.107803 2569 generic.go:358] "Generic (PLEG): container finished" podID="dfb1b096-0c6b-4634-a22b-de3317e5ca49" containerID="af14f4a1d54e825135847002182b1fd3af39eb251ae9479f11e02054efdc2085" exitCode=0
Apr 20 07:59:31.107939 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.107885 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" event={"ID":"dfb1b096-0c6b-4634-a22b-de3317e5ca49","Type":"ContainerDied","Data":"af14f4a1d54e825135847002182b1fd3af39eb251ae9479f11e02054efdc2085"}
Apr 20 07:59:31.109744 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.109725 2569 generic.go:358] "Generic (PLEG): container finished" podID="a0f8ffbf-0872-4075-b182-a5ad51501cfd" containerID="dc569b1934277fea594933231e0910df81c793c1e284ba15fbb6f0ad278c589d" exitCode=0
Apr 20 07:59:31.109872 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.109857 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm" event={"ID":"a0f8ffbf-0872-4075-b182-a5ad51501cfd","Type":"ContainerDied","Data":"dc569b1934277fea594933231e0910df81c793c1e284ba15fbb6f0ad278c589d"}
Apr 20 07:59:31.237328 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.237301 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"
Apr 20 07:59:31.396641 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.396606 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s8l2\" (UniqueName: \"kubernetes.io/projected/24a008ba-c82d-48df-9a39-aa2a17fe63f4-kube-api-access-5s8l2\") pod \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") "
Apr 20 07:59:31.396825 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.396668 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-bundle\") pod \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") "
Apr 20 07:59:31.396825 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.396697 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-util\") pod \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\" (UID: \"24a008ba-c82d-48df-9a39-aa2a17fe63f4\") "
Apr 20 07:59:31.397133 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.397099 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-bundle" (OuterVolumeSpecName: "bundle") pod "24a008ba-c82d-48df-9a39-aa2a17fe63f4" (UID: "24a008ba-c82d-48df-9a39-aa2a17fe63f4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:59:31.398839 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.398816 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a008ba-c82d-48df-9a39-aa2a17fe63f4-kube-api-access-5s8l2" (OuterVolumeSpecName: "kube-api-access-5s8l2") pod "24a008ba-c82d-48df-9a39-aa2a17fe63f4" (UID: "24a008ba-c82d-48df-9a39-aa2a17fe63f4"). InnerVolumeSpecName "kube-api-access-5s8l2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:59:31.401951 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.401905 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-util" (OuterVolumeSpecName: "util") pod "24a008ba-c82d-48df-9a39-aa2a17fe63f4" (UID: "24a008ba-c82d-48df-9a39-aa2a17fe63f4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:59:31.497926 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.497843 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5s8l2\" (UniqueName: \"kubernetes.io/projected/24a008ba-c82d-48df-9a39-aa2a17fe63f4-kube-api-access-5s8l2\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:31.497926 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.497898 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-bundle\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:31.497926 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:31.497908 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24a008ba-c82d-48df-9a39-aa2a17fe63f4-util\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:32.114816 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.114790 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc"
Apr 20 07:59:32.115350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.114820 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc" event={"ID":"24a008ba-c82d-48df-9a39-aa2a17fe63f4","Type":"ContainerDied","Data":"8f573c59bcaaf54ffa3948b160742bea77fb694f1f6d1b67bbe7d38a18086498"}
Apr 20 07:59:32.115350 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.114857 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f573c59bcaaf54ffa3948b160742bea77fb694f1f6d1b67bbe7d38a18086498"
Apr 20 07:59:32.279967 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.279940 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t"
Apr 20 07:59:32.283168 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.283148 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz"
Apr 20 07:59:32.307305 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.307285 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm"
Apr 20 07:59:32.405407 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.405326 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr8fh\" (UniqueName: \"kubernetes.io/projected/dfb1b096-0c6b-4634-a22b-de3317e5ca49-kube-api-access-jr8fh\") pod \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") "
Apr 20 07:59:32.405407 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.405381 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-util\") pod \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") "
Apr 20 07:59:32.405623 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.405414 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-bundle\") pod \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") "
Apr 20 07:59:32.405623 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.405447 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9z5d\" (UniqueName: \"kubernetes.io/projected/a0f8ffbf-0872-4075-b182-a5ad51501cfd-kube-api-access-q9z5d\") pod \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") "
Apr 20 07:59:32.405623 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.405465 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-bundle\") pod \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") "
Apr 20 07:59:32.405623 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.405487 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z648s\" (UniqueName: \"kubernetes.io/projected/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-kube-api-access-z648s\") pod \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\" (UID: \"e03b4b3e-46c1-4df2-ad89-b51387e90d5f\") "
Apr 20 07:59:32.405623 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.405521 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-util\") pod \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") "
Apr 20 07:59:32.405623 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.405547 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-bundle\") pod \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\" (UID: \"dfb1b096-0c6b-4634-a22b-de3317e5ca49\") "
Apr 20 07:59:32.405623 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.405581 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-util\") pod \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\" (UID: \"a0f8ffbf-0872-4075-b182-a5ad51501cfd\") "
Apr 20 07:59:32.406072 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.406043 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-bundle" (OuterVolumeSpecName: "bundle") pod "e03b4b3e-46c1-4df2-ad89-b51387e90d5f" (UID: "e03b4b3e-46c1-4df2-ad89-b51387e90d5f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:59:32.406582 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.406504 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-bundle" (OuterVolumeSpecName: "bundle") pod "dfb1b096-0c6b-4634-a22b-de3317e5ca49" (UID: "dfb1b096-0c6b-4634-a22b-de3317e5ca49"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:59:32.406582 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.406524 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-bundle" (OuterVolumeSpecName: "bundle") pod "a0f8ffbf-0872-4075-b182-a5ad51501cfd" (UID: "a0f8ffbf-0872-4075-b182-a5ad51501cfd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:59:32.408122 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.408084 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-kube-api-access-z648s" (OuterVolumeSpecName: "kube-api-access-z648s") pod "e03b4b3e-46c1-4df2-ad89-b51387e90d5f" (UID: "e03b4b3e-46c1-4df2-ad89-b51387e90d5f"). InnerVolumeSpecName "kube-api-access-z648s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:59:32.408122 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.408093 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f8ffbf-0872-4075-b182-a5ad51501cfd-kube-api-access-q9z5d" (OuterVolumeSpecName: "kube-api-access-q9z5d") pod "a0f8ffbf-0872-4075-b182-a5ad51501cfd" (UID: "a0f8ffbf-0872-4075-b182-a5ad51501cfd"). InnerVolumeSpecName "kube-api-access-q9z5d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:59:32.408577 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.408554 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb1b096-0c6b-4634-a22b-de3317e5ca49-kube-api-access-jr8fh" (OuterVolumeSpecName: "kube-api-access-jr8fh") pod "dfb1b096-0c6b-4634-a22b-de3317e5ca49" (UID: "dfb1b096-0c6b-4634-a22b-de3317e5ca49"). InnerVolumeSpecName "kube-api-access-jr8fh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:59:32.411500 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.411479 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-util" (OuterVolumeSpecName: "util") pod "e03b4b3e-46c1-4df2-ad89-b51387e90d5f" (UID: "e03b4b3e-46c1-4df2-ad89-b51387e90d5f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:59:32.411728 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.411705 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-util" (OuterVolumeSpecName: "util") pod "dfb1b096-0c6b-4634-a22b-de3317e5ca49" (UID: "dfb1b096-0c6b-4634-a22b-de3317e5ca49"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:59:32.412299 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.412279 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-util" (OuterVolumeSpecName: "util") pod "a0f8ffbf-0872-4075-b182-a5ad51501cfd" (UID: "a0f8ffbf-0872-4075-b182-a5ad51501cfd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:59:32.506528 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.506499 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q9z5d\" (UniqueName: \"kubernetes.io/projected/a0f8ffbf-0872-4075-b182-a5ad51501cfd-kube-api-access-q9z5d\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:32.506528 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.506522 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-bundle\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:32.506528 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.506532 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z648s\" (UniqueName: \"kubernetes.io/projected/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-kube-api-access-z648s\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:32.506722 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.506543 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-util\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:32.506722 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.506553 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfb1b096-0c6b-4634-a22b-de3317e5ca49-bundle\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:32.506722 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.506560 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0f8ffbf-0872-4075-b182-a5ad51501cfd-util\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:32.506722 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.506568 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jr8fh\" (UniqueName: \"kubernetes.io/projected/dfb1b096-0c6b-4634-a22b-de3317e5ca49-kube-api-access-jr8fh\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:32.506722 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.506576 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-util\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:32.506722 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:32.506589 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e03b4b3e-46c1-4df2-ad89-b51387e90d5f-bundle\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\""
Apr 20 07:59:33.119820 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:33.119785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz" event={"ID":"e03b4b3e-46c1-4df2-ad89-b51387e90d5f","Type":"ContainerDied","Data":"c47713f868e78b4e9fafed584e55b9f5dccbde40f13b0765f153e0781be7edbd"}
Apr 20 07:59:33.119820 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:33.119805 2569 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz" Apr 20 07:59:33.119820 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:33.119817 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c47713f868e78b4e9fafed584e55b9f5dccbde40f13b0765f153e0781be7edbd" Apr 20 07:59:33.121444 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:33.121425 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" event={"ID":"dfb1b096-0c6b-4634-a22b-de3317e5ca49","Type":"ContainerDied","Data":"026fa2ca40b0d9fc3408e8369080d78d9da249b19ccbf842c1373f37a6443edd"} Apr 20 07:59:33.121561 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:33.121447 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="026fa2ca40b0d9fc3408e8369080d78d9da249b19ccbf842c1373f37a6443edd" Apr 20 07:59:33.121561 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:33.121448 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t" Apr 20 07:59:33.123218 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:33.123180 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm" event={"ID":"a0f8ffbf-0872-4075-b182-a5ad51501cfd","Type":"ContainerDied","Data":"0f4c7d1a2614ca03a906bc3098b743280ddbf980416b8577d0557399627cbbb5"} Apr 20 07:59:33.123303 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:33.123220 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm" Apr 20 07:59:33.123303 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:33.123225 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f4c7d1a2614ca03a906bc3098b743280ddbf980416b8577d0557399627cbbb5" Apr 20 07:59:46.763154 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763120 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-zp5mg"] Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763458 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a008ba-c82d-48df-9a39-aa2a17fe63f4" containerName="pull" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763469 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a008ba-c82d-48df-9a39-aa2a17fe63f4" containerName="pull" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763482 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e03b4b3e-46c1-4df2-ad89-b51387e90d5f" containerName="util" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763487 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03b4b3e-46c1-4df2-ad89-b51387e90d5f" containerName="util" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763495 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a008ba-c82d-48df-9a39-aa2a17fe63f4" containerName="extract" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763500 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a008ba-c82d-48df-9a39-aa2a17fe63f4" containerName="extract" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763506 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a0f8ffbf-0872-4075-b182-a5ad51501cfd" containerName="extract" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763511 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f8ffbf-0872-4075-b182-a5ad51501cfd" containerName="extract" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763524 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfb1b096-0c6b-4634-a22b-de3317e5ca49" containerName="util" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763529 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb1b096-0c6b-4634-a22b-de3317e5ca49" containerName="util" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763537 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24a008ba-c82d-48df-9a39-aa2a17fe63f4" containerName="util" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763542 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a008ba-c82d-48df-9a39-aa2a17fe63f4" containerName="util" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763547 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e03b4b3e-46c1-4df2-ad89-b51387e90d5f" containerName="extract" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763552 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03b4b3e-46c1-4df2-ad89-b51387e90d5f" containerName="extract" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763559 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e03b4b3e-46c1-4df2-ad89-b51387e90d5f" containerName="pull" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763564 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03b4b3e-46c1-4df2-ad89-b51387e90d5f" containerName="pull" Apr 20 07:59:46.763571 
ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763571 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfb1b096-0c6b-4634-a22b-de3317e5ca49" containerName="pull" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763576 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb1b096-0c6b-4634-a22b-de3317e5ca49" containerName="pull" Apr 20 07:59:46.763571 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763582 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0f8ffbf-0872-4075-b182-a5ad51501cfd" containerName="pull" Apr 20 07:59:46.764198 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763587 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f8ffbf-0872-4075-b182-a5ad51501cfd" containerName="pull" Apr 20 07:59:46.764198 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763595 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0f8ffbf-0872-4075-b182-a5ad51501cfd" containerName="util" Apr 20 07:59:46.764198 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763600 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f8ffbf-0872-4075-b182-a5ad51501cfd" containerName="util" Apr 20 07:59:46.764198 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763605 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfb1b096-0c6b-4634-a22b-de3317e5ca49" containerName="extract" Apr 20 07:59:46.764198 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763610 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb1b096-0c6b-4634-a22b-de3317e5ca49" containerName="extract" Apr 20 07:59:46.764198 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763652 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="24a008ba-c82d-48df-9a39-aa2a17fe63f4" containerName="extract" Apr 20 07:59:46.764198 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763662 2569 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="dfb1b096-0c6b-4634-a22b-de3317e5ca49" containerName="extract" Apr 20 07:59:46.764198 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763671 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e03b4b3e-46c1-4df2-ad89-b51387e90d5f" containerName="extract" Apr 20 07:59:46.764198 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.763677 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0f8ffbf-0872-4075-b182-a5ad51501cfd" containerName="extract" Apr 20 07:59:46.768094 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.768071 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-zp5mg" Apr 20 07:59:46.770104 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.770076 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-z77p5\"" Apr 20 07:59:46.770568 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.770550 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 07:59:46.770638 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.770559 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 07:59:46.775329 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.775307 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-zp5mg"] Apr 20 07:59:46.809484 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.809451 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncwr2\" (UniqueName: \"kubernetes.io/projected/21ee5ad3-fc71-4930-9b45-2317ed0b800d-kube-api-access-ncwr2\") pod \"authorino-operator-657f44b778-zp5mg\" (UID: \"21ee5ad3-fc71-4930-9b45-2317ed0b800d\") " 
pod="kuadrant-system/authorino-operator-657f44b778-zp5mg" Apr 20 07:59:46.910030 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.909993 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncwr2\" (UniqueName: \"kubernetes.io/projected/21ee5ad3-fc71-4930-9b45-2317ed0b800d-kube-api-access-ncwr2\") pod \"authorino-operator-657f44b778-zp5mg\" (UID: \"21ee5ad3-fc71-4930-9b45-2317ed0b800d\") " pod="kuadrant-system/authorino-operator-657f44b778-zp5mg" Apr 20 07:59:46.923524 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:46.923494 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncwr2\" (UniqueName: \"kubernetes.io/projected/21ee5ad3-fc71-4930-9b45-2317ed0b800d-kube-api-access-ncwr2\") pod \"authorino-operator-657f44b778-zp5mg\" (UID: \"21ee5ad3-fc71-4930-9b45-2317ed0b800d\") " pod="kuadrant-system/authorino-operator-657f44b778-zp5mg" Apr 20 07:59:47.081354 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:47.081263 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-zp5mg" Apr 20 07:59:47.200339 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:47.200315 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-zp5mg"] Apr 20 07:59:47.202118 ip-10-0-138-4 kubenswrapper[2569]: W0420 07:59:47.202091 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21ee5ad3_fc71_4930_9b45_2317ed0b800d.slice/crio-e307d591a3342f141df260f83a62becd8e52488fbac342030ebde9135b900c1c WatchSource:0}: Error finding container e307d591a3342f141df260f83a62becd8e52488fbac342030ebde9135b900c1c: Status 404 returned error can't find the container with id e307d591a3342f141df260f83a62becd8e52488fbac342030ebde9135b900c1c Apr 20 07:59:48.187909 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:48.187863 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-zp5mg" event={"ID":"21ee5ad3-fc71-4930-9b45-2317ed0b800d","Type":"ContainerStarted","Data":"e307d591a3342f141df260f83a62becd8e52488fbac342030ebde9135b900c1c"} Apr 20 07:59:49.193490 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:49.193454 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-zp5mg" event={"ID":"21ee5ad3-fc71-4930-9b45-2317ed0b800d","Type":"ContainerStarted","Data":"ef9ef066d9e14647ff56aa1c551ecff3be4296a74a9303403a88b19828be713d"} Apr 20 07:59:49.193882 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:49.193533 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-zp5mg" Apr 20 07:59:49.209450 ip-10-0-138-4 kubenswrapper[2569]: I0420 07:59:49.209399 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-zp5mg" podStartSLOduration=1.708044133 
podStartE2EDuration="3.209386441s" podCreationTimestamp="2026-04-20 07:59:46 +0000 UTC" firstStartedPulling="2026-04-20 07:59:47.203974361 +0000 UTC m=+580.500547323" lastFinishedPulling="2026-04-20 07:59:48.705316666 +0000 UTC m=+582.001889631" observedRunningTime="2026-04-20 07:59:49.207178815 +0000 UTC m=+582.503751800" watchObservedRunningTime="2026-04-20 07:59:49.209386441 +0000 UTC m=+582.505959424" Apr 20 08:00:00.199327 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:00.199290 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-zp5mg" Apr 20 08:00:03.258987 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.258954 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk"] Apr 20 08:00:03.263809 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.263786 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:00:03.268286 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.268266 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6dggc\"" Apr 20 08:00:03.281404 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.281382 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk"] Apr 20 08:00:03.337465 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.337436 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29cpk\" (UniqueName: \"kubernetes.io/projected/dbd359e5-b9cc-4e77-84d4-20b2777699e7-kube-api-access-29cpk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rlzkk\" (UID: \"dbd359e5-b9cc-4e77-84d4-20b2777699e7\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:00:03.337632 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.337526 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dbd359e5-b9cc-4e77-84d4-20b2777699e7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rlzkk\" (UID: \"dbd359e5-b9cc-4e77-84d4-20b2777699e7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:00:03.438885 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.438849 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29cpk\" (UniqueName: \"kubernetes.io/projected/dbd359e5-b9cc-4e77-84d4-20b2777699e7-kube-api-access-29cpk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rlzkk\" (UID: \"dbd359e5-b9cc-4e77-84d4-20b2777699e7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:00:03.439064 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.438912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dbd359e5-b9cc-4e77-84d4-20b2777699e7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rlzkk\" (UID: \"dbd359e5-b9cc-4e77-84d4-20b2777699e7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:00:03.439313 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.439289 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dbd359e5-b9cc-4e77-84d4-20b2777699e7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rlzkk\" (UID: \"dbd359e5-b9cc-4e77-84d4-20b2777699e7\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:00:03.447482 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.447457 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29cpk\" (UniqueName: \"kubernetes.io/projected/dbd359e5-b9cc-4e77-84d4-20b2777699e7-kube-api-access-29cpk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rlzkk\" (UID: \"dbd359e5-b9cc-4e77-84d4-20b2777699e7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:00:03.573810 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.573775 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:00:03.703187 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:03.703160 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk"] Apr 20 08:00:03.705766 ip-10-0-138-4 kubenswrapper[2569]: W0420 08:00:03.705738 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbd359e5_b9cc_4e77_84d4_20b2777699e7.slice/crio-9b3e64d7c75b07dde5e77e40ee4e24b02d010e58d65670f894f9655deaec75d1 WatchSource:0}: Error finding container 9b3e64d7c75b07dde5e77e40ee4e24b02d010e58d65670f894f9655deaec75d1: Status 404 returned error can't find the container with id 9b3e64d7c75b07dde5e77e40ee4e24b02d010e58d65670f894f9655deaec75d1 Apr 20 08:00:04.248481 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:04.248445 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" event={"ID":"dbd359e5-b9cc-4e77-84d4-20b2777699e7","Type":"ContainerStarted","Data":"9b3e64d7c75b07dde5e77e40ee4e24b02d010e58d65670f894f9655deaec75d1"} Apr 20 08:00:07.223350 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:07.223309 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:00:07.223784 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:07.223398 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:00:09.269284 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:09.269247 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" event={"ID":"dbd359e5-b9cc-4e77-84d4-20b2777699e7","Type":"ContainerStarted","Data":"9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924"} Apr 20 08:00:09.269676 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:09.269516 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:00:09.284415 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:09.284370 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" podStartSLOduration=1.537789862 podStartE2EDuration="6.284357827s" podCreationTimestamp="2026-04-20 08:00:03 +0000 UTC" firstStartedPulling="2026-04-20 08:00:03.708663107 +0000 UTC m=+597.005236069" lastFinishedPulling="2026-04-20 08:00:08.455231067 +0000 UTC m=+601.751804034" observedRunningTime="2026-04-20 08:00:09.282881398 +0000 UTC m=+602.579454381" watchObservedRunningTime="2026-04-20 08:00:09.284357827 +0000 UTC m=+602.580930811" Apr 20 08:00:20.275115 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:00:20.275034 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:02:07.680918 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.680880 2569 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7ccccb6d4f-lbgk6"] Apr 20 08:02:07.684629 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.684605 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" Apr 20 08:02:07.687008 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.686984 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 08:02:07.687917 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.687716 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 08:02:07.687917 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.687775 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-wlc48\"" Apr 20 08:02:07.692824 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.692791 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7ccccb6d4f-lbgk6"] Apr 20 08:02:07.732603 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.732562 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpltt\" (UniqueName: \"kubernetes.io/projected/39c03c8b-327c-4bd1-86ef-b8a5b1039200-kube-api-access-vpltt\") pod \"maas-api-7ccccb6d4f-lbgk6\" (UID: \"39c03c8b-327c-4bd1-86ef-b8a5b1039200\") " pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" Apr 20 08:02:07.732814 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.732627 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/39c03c8b-327c-4bd1-86ef-b8a5b1039200-maas-api-tls\") pod \"maas-api-7ccccb6d4f-lbgk6\" (UID: \"39c03c8b-327c-4bd1-86ef-b8a5b1039200\") " pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" Apr 20 08:02:07.833130 ip-10-0-138-4 kubenswrapper[2569]: I0420 
08:02:07.833091 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/39c03c8b-327c-4bd1-86ef-b8a5b1039200-maas-api-tls\") pod \"maas-api-7ccccb6d4f-lbgk6\" (UID: \"39c03c8b-327c-4bd1-86ef-b8a5b1039200\") " pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" Apr 20 08:02:07.833321 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.833163 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpltt\" (UniqueName: \"kubernetes.io/projected/39c03c8b-327c-4bd1-86ef-b8a5b1039200-kube-api-access-vpltt\") pod \"maas-api-7ccccb6d4f-lbgk6\" (UID: \"39c03c8b-327c-4bd1-86ef-b8a5b1039200\") " pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" Apr 20 08:02:07.835633 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.835611 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/39c03c8b-327c-4bd1-86ef-b8a5b1039200-maas-api-tls\") pod \"maas-api-7ccccb6d4f-lbgk6\" (UID: \"39c03c8b-327c-4bd1-86ef-b8a5b1039200\") " pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" Apr 20 08:02:07.840124 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:07.840099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpltt\" (UniqueName: \"kubernetes.io/projected/39c03c8b-327c-4bd1-86ef-b8a5b1039200-kube-api-access-vpltt\") pod \"maas-api-7ccccb6d4f-lbgk6\" (UID: \"39c03c8b-327c-4bd1-86ef-b8a5b1039200\") " pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" Apr 20 08:02:08.004458 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:08.004362 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" Apr 20 08:02:08.133401 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:08.133373 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7ccccb6d4f-lbgk6"] Apr 20 08:02:08.135278 ip-10-0-138-4 kubenswrapper[2569]: W0420 08:02:08.135249 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39c03c8b_327c_4bd1_86ef_b8a5b1039200.slice/crio-819253bd769f51218b6112606fae830603b3640342fbc736b1a3534c2a901879 WatchSource:0}: Error finding container 819253bd769f51218b6112606fae830603b3640342fbc736b1a3534c2a901879: Status 404 returned error can't find the container with id 819253bd769f51218b6112606fae830603b3640342fbc736b1a3534c2a901879 Apr 20 08:02:08.136452 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:08.136434 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 08:02:08.705245 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:08.705191 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" event={"ID":"39c03c8b-327c-4bd1-86ef-b8a5b1039200","Type":"ContainerStarted","Data":"819253bd769f51218b6112606fae830603b3640342fbc736b1a3534c2a901879"} Apr 20 08:02:10.714680 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:10.714596 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" event={"ID":"39c03c8b-327c-4bd1-86ef-b8a5b1039200","Type":"ContainerStarted","Data":"956ad0f262efc8a9c374bc6c22ecbdb89a1fcc60a5f3bdc783067facaf367b03"} Apr 20 08:02:10.715035 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:10.714714 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" Apr 20 08:02:10.731438 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:10.731392 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" podStartSLOduration=1.464389687 podStartE2EDuration="3.731378334s" podCreationTimestamp="2026-04-20 08:02:07 +0000 UTC" firstStartedPulling="2026-04-20 08:02:08.136555527 +0000 UTC m=+721.433128489" lastFinishedPulling="2026-04-20 08:02:10.403544169 +0000 UTC m=+723.700117136" observedRunningTime="2026-04-20 08:02:10.728098386 +0000 UTC m=+724.024671362" watchObservedRunningTime="2026-04-20 08:02:10.731378334 +0000 UTC m=+724.027951344" Apr 20 08:02:16.493977 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.493943 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926"] Apr 20 08:02:16.497634 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.497610 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.499602 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.499580 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 08:02:16.499707 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.499580 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 08:02:16.500145 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.500126 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 08:02:16.500250 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.500185 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-5swfc\"" Apr 20 08:02:16.504965 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.504940 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926"] Apr 20 08:02:16.606515 ip-10-0-138-4 
kubenswrapper[2569]: I0420 08:02:16.606472 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.606701 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.606547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.606701 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.606598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.606816 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.606700 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14039460-1624-4f75-8d8f-59076df141e5-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.606816 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.606754 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.606816 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.606782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8gsw\" (UniqueName: \"kubernetes.io/projected/14039460-1624-4f75-8d8f-59076df141e5-kube-api-access-z8gsw\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.707930 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.707896 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.708089 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.707944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.708089 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.707977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14039460-1624-4f75-8d8f-59076df141e5-tls-certs\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.708089 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.707999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.708089 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.708017 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gsw\" (UniqueName: \"kubernetes.io/projected/14039460-1624-4f75-8d8f-59076df141e5-kube-api-access-z8gsw\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.708089 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.708046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.708424 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.708403 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.708508 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.708471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.708508 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.708496 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.710201 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.710172 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14039460-1624-4f75-8d8f-59076df141e5-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.710486 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.710469 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14039460-1624-4f75-8d8f-59076df141e5-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.714812 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.714789 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z8gsw\" (UniqueName: \"kubernetes.io/projected/14039460-1624-4f75-8d8f-59076df141e5-kube-api-access-z8gsw\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-tf926\" (UID: \"14039460-1624-4f75-8d8f-59076df141e5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.724568 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.724548 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7ccccb6d4f-lbgk6" Apr 20 08:02:16.808733 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.808704 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:16.937462 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:16.937434 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926"] Apr 20 08:02:16.938618 ip-10-0-138-4 kubenswrapper[2569]: W0420 08:02:16.938586 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14039460_1624_4f75_8d8f_59076df141e5.slice/crio-f7873852e3232e7f22699e6ac4771ce00787c82f89c507171127a8e8ca1ef884 WatchSource:0}: Error finding container f7873852e3232e7f22699e6ac4771ce00787c82f89c507171127a8e8ca1ef884: Status 404 returned error can't find the container with id f7873852e3232e7f22699e6ac4771ce00787c82f89c507171127a8e8ca1ef884 Apr 20 08:02:17.741819 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:17.741776 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" event={"ID":"14039460-1624-4f75-8d8f-59076df141e5","Type":"ContainerStarted","Data":"f7873852e3232e7f22699e6ac4771ce00787c82f89c507171127a8e8ca1ef884"} Apr 20 08:02:22.774408 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:22.774364 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" event={"ID":"14039460-1624-4f75-8d8f-59076df141e5","Type":"ContainerStarted","Data":"1afa44cd7131940d9d314c1d6abc163b3ca0f746c8c624b1f91e5e1a0f20c068"} Apr 20 08:02:30.806515 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:30.806478 2569 generic.go:358] "Generic (PLEG): container finished" podID="14039460-1624-4f75-8d8f-59076df141e5" containerID="1afa44cd7131940d9d314c1d6abc163b3ca0f746c8c624b1f91e5e1a0f20c068" exitCode=0 Apr 20 08:02:30.806891 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:30.806524 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" event={"ID":"14039460-1624-4f75-8d8f-59076df141e5","Type":"ContainerDied","Data":"1afa44cd7131940d9d314c1d6abc163b3ca0f746c8c624b1f91e5e1a0f20c068"} Apr 20 08:02:32.816653 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:32.816618 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" event={"ID":"14039460-1624-4f75-8d8f-59076df141e5","Type":"ContainerStarted","Data":"58d6df55ac0b95355a66b0f9c42bc5af8305d5cd4c22205863495f6de0a8b558"} Apr 20 08:02:32.817038 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:32.816825 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:32.834010 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:32.833946 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" podStartSLOduration=1.697597984 podStartE2EDuration="16.833933238s" podCreationTimestamp="2026-04-20 08:02:16 +0000 UTC" firstStartedPulling="2026-04-20 08:02:16.940456713 +0000 UTC m=+730.237029678" lastFinishedPulling="2026-04-20 08:02:32.076791967 +0000 UTC m=+745.373364932" observedRunningTime="2026-04-20 08:02:32.831804279 +0000 UTC 
m=+746.128377264" watchObservedRunningTime="2026-04-20 08:02:32.833933238 +0000 UTC m=+746.130506221" Apr 20 08:02:43.833638 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:43.833600 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-tf926" Apr 20 08:02:45.194277 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.194230 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr"] Apr 20 08:02:45.352091 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.352051 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr"] Apr 20 08:02:45.353725 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.353701 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.357754 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.357725 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 08:02:45.465297 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.465176 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.465297 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.465271 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: 
\"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.465526 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.465302 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fkwv\" (UniqueName: \"kubernetes.io/projected/5ff0b80a-739c-4612-95ad-2419940077dc-kube-api-access-9fkwv\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.465526 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.465331 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff0b80a-739c-4612-95ad-2419940077dc-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.465526 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.465396 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.465526 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.465456 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.566532 ip-10-0-138-4 kubenswrapper[2569]: I0420 
08:02:45.566482 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.566532 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.566533 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fkwv\" (UniqueName: \"kubernetes.io/projected/5ff0b80a-739c-4612-95ad-2419940077dc-kube-api-access-9fkwv\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.566768 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.566659 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff0b80a-739c-4612-95ad-2419940077dc-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.566768 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.566741 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.566893 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.566774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-model-cache\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.566893 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.566842 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.566992 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.566885 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.567098 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.567076 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.567167 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.567152 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.569002 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.568972 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ff0b80a-739c-4612-95ad-2419940077dc-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.569130 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.569112 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff0b80a-739c-4612-95ad-2419940077dc-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.573407 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.573384 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fkwv\" (UniqueName: \"kubernetes.io/projected/5ff0b80a-739c-4612-95ad-2419940077dc-kube-api-access-9fkwv\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr\" (UID: \"5ff0b80a-739c-4612-95ad-2419940077dc\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.667866 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.667826 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:45.798990 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.798963 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr"] Apr 20 08:02:45.801516 ip-10-0-138-4 kubenswrapper[2569]: W0420 08:02:45.801477 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff0b80a_739c_4612_95ad_2419940077dc.slice/crio-f62a95c6a7f6e0be090f0dd3b4dad806fcb0680352635f03a9a1c482293c2f55 WatchSource:0}: Error finding container f62a95c6a7f6e0be090f0dd3b4dad806fcb0680352635f03a9a1c482293c2f55: Status 404 returned error can't find the container with id f62a95c6a7f6e0be090f0dd3b4dad806fcb0680352635f03a9a1c482293c2f55 Apr 20 08:02:45.869281 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:45.869250 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" event={"ID":"5ff0b80a-739c-4612-95ad-2419940077dc","Type":"ContainerStarted","Data":"f62a95c6a7f6e0be090f0dd3b4dad806fcb0680352635f03a9a1c482293c2f55"} Apr 20 08:02:46.874680 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:46.874632 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" event={"ID":"5ff0b80a-739c-4612-95ad-2419940077dc","Type":"ContainerStarted","Data":"96a4ba5ddf5672d14f4c827757565a6fdb868e8fa52f233f59dc822e6f73eddf"} Apr 20 08:02:50.692149 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:50.692113 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws"] Apr 20 08:02:50.879497 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:50.879466 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws"] Apr 20 08:02:50.879661 ip-10-0-138-4 kubenswrapper[2569]: I0420 
08:02:50.879584 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:50.881457 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:50.881432 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 08:02:51.016846 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.016811 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.017009 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.016875 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.017009 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.016898 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.017009 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.016923 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.017009 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.016958 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df696bed-9e0c-4a63-95b5-1bd7009a461b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.017149 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.017023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2frx\" (UniqueName: \"kubernetes.io/projected/df696bed-9e0c-4a63-95b5-1bd7009a461b-kube-api-access-k2frx\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.118320 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.118269 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.118320 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.118319 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.118523 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.118456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.118565 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.118534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df696bed-9e0c-4a63-95b5-1bd7009a461b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.118599 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.118569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2frx\" (UniqueName: \"kubernetes.io/projected/df696bed-9e0c-4a63-95b5-1bd7009a461b-kube-api-access-k2frx\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.118638 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.118628 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.118687 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.118665 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.118958 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.118932 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.119049 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.118962 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.121052 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.121035 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/df696bed-9e0c-4a63-95b5-1bd7009a461b-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.121437 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.121419 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df696bed-9e0c-4a63-95b5-1bd7009a461b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.125579 ip-10-0-138-4 kubenswrapper[2569]: I0420 
08:02:51.125557 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2frx\" (UniqueName: \"kubernetes.io/projected/df696bed-9e0c-4a63-95b5-1bd7009a461b-kube-api-access-k2frx\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-28bws\" (UID: \"df696bed-9e0c-4a63-95b5-1bd7009a461b\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.190776 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.190730 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:51.327766 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.327735 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws"] Apr 20 08:02:51.329835 ip-10-0-138-4 kubenswrapper[2569]: W0420 08:02:51.329801 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf696bed_9e0c_4a63_95b5_1bd7009a461b.slice/crio-19d4f51c2eb13f30c958e81addae3dfc447307d5e86a7e82fb3f9ddfa01b85b6 WatchSource:0}: Error finding container 19d4f51c2eb13f30c958e81addae3dfc447307d5e86a7e82fb3f9ddfa01b85b6: Status 404 returned error can't find the container with id 19d4f51c2eb13f30c958e81addae3dfc447307d5e86a7e82fb3f9ddfa01b85b6 Apr 20 08:02:51.900457 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.900419 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" event={"ID":"df696bed-9e0c-4a63-95b5-1bd7009a461b","Type":"ContainerStarted","Data":"8b91ce205b156f0c0a28380136b29d585aefcd4c051a423b90e320ee5edde911"} Apr 20 08:02:51.900457 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.900460 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" 
event={"ID":"df696bed-9e0c-4a63-95b5-1bd7009a461b","Type":"ContainerStarted","Data":"19d4f51c2eb13f30c958e81addae3dfc447307d5e86a7e82fb3f9ddfa01b85b6"} Apr 20 08:02:51.901875 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.901850 2569 generic.go:358] "Generic (PLEG): container finished" podID="5ff0b80a-739c-4612-95ad-2419940077dc" containerID="96a4ba5ddf5672d14f4c827757565a6fdb868e8fa52f233f59dc822e6f73eddf" exitCode=0 Apr 20 08:02:51.901986 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:51.901925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" event={"ID":"5ff0b80a-739c-4612-95ad-2419940077dc","Type":"ContainerDied","Data":"96a4ba5ddf5672d14f4c827757565a6fdb868e8fa52f233f59dc822e6f73eddf"} Apr 20 08:02:52.907541 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:52.907504 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" event={"ID":"5ff0b80a-739c-4612-95ad-2419940077dc","Type":"ContainerStarted","Data":"c6e2cd734cdb0650c26a56867f406df6dd266d03b87a54aef3b5b75cc8aafa1c"} Apr 20 08:02:52.908249 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:52.908221 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:02:52.924563 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:52.924522 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" podStartSLOduration=7.602419676 podStartE2EDuration="7.92450974s" podCreationTimestamp="2026-04-20 08:02:45 +0000 UTC" firstStartedPulling="2026-04-20 08:02:51.902480069 +0000 UTC m=+765.199053031" lastFinishedPulling="2026-04-20 08:02:52.224570128 +0000 UTC m=+765.521143095" observedRunningTime="2026-04-20 08:02:52.92255884 +0000 UTC m=+766.219131824" watchObservedRunningTime="2026-04-20 08:02:52.92450974 +0000 UTC m=+766.221082723" Apr 20 
08:02:56.924789 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:56.924754 2569 generic.go:358] "Generic (PLEG): container finished" podID="df696bed-9e0c-4a63-95b5-1bd7009a461b" containerID="8b91ce205b156f0c0a28380136b29d585aefcd4c051a423b90e320ee5edde911" exitCode=0 Apr 20 08:02:56.925149 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:56.924830 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" event={"ID":"df696bed-9e0c-4a63-95b5-1bd7009a461b","Type":"ContainerDied","Data":"8b91ce205b156f0c0a28380136b29d585aefcd4c051a423b90e320ee5edde911"} Apr 20 08:02:57.929909 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:57.929873 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" event={"ID":"df696bed-9e0c-4a63-95b5-1bd7009a461b","Type":"ContainerStarted","Data":"f708cc067d8c26c614d6ed36208fd1c7f665c1057e8bc22efbcfdfc51184567f"} Apr 20 08:02:57.930314 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:57.930089 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:02:57.947363 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:02:57.947314 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" podStartSLOduration=7.703942473 podStartE2EDuration="7.947297584s" podCreationTimestamp="2026-04-20 08:02:50 +0000 UTC" firstStartedPulling="2026-04-20 08:02:56.925465711 +0000 UTC m=+770.222038672" lastFinishedPulling="2026-04-20 08:02:57.168820808 +0000 UTC m=+770.465393783" observedRunningTime="2026-04-20 08:02:57.944609899 +0000 UTC m=+771.241182897" watchObservedRunningTime="2026-04-20 08:02:57.947297584 +0000 UTC m=+771.243870571" Apr 20 08:03:03.924942 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:03:03.924911 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr" Apr 20 08:03:08.948778 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:03:08.948746 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-28bws" Apr 20 08:05:07.253363 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:05:07.253333 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:05:07.256447 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:05:07.256421 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:10:07.289223 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:10:07.289181 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:10:07.292085 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:10:07.292066 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:15:07.318985 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:07.318958 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:15:07.324134 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:07.324112 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:15:35.912236 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:35.912129 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk"] Apr 20 08:15:35.912801 
ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:35.912458 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" podUID="dbd359e5-b9cc-4e77-84d4-20b2777699e7" containerName="manager" containerID="cri-o://9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924" gracePeriod=10 Apr 20 08:15:37.668703 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.668676 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:15:37.751292 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.751263 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dbd359e5-b9cc-4e77-84d4-20b2777699e7-extensions-socket-volume\") pod \"dbd359e5-b9cc-4e77-84d4-20b2777699e7\" (UID: \"dbd359e5-b9cc-4e77-84d4-20b2777699e7\") " Apr 20 08:15:37.751428 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.751300 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29cpk\" (UniqueName: \"kubernetes.io/projected/dbd359e5-b9cc-4e77-84d4-20b2777699e7-kube-api-access-29cpk\") pod \"dbd359e5-b9cc-4e77-84d4-20b2777699e7\" (UID: \"dbd359e5-b9cc-4e77-84d4-20b2777699e7\") " Apr 20 08:15:37.751648 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.751624 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbd359e5-b9cc-4e77-84d4-20b2777699e7-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "dbd359e5-b9cc-4e77-84d4-20b2777699e7" (UID: "dbd359e5-b9cc-4e77-84d4-20b2777699e7"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 08:15:37.753345 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.753317 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd359e5-b9cc-4e77-84d4-20b2777699e7-kube-api-access-29cpk" (OuterVolumeSpecName: "kube-api-access-29cpk") pod "dbd359e5-b9cc-4e77-84d4-20b2777699e7" (UID: "dbd359e5-b9cc-4e77-84d4-20b2777699e7"). InnerVolumeSpecName "kube-api-access-29cpk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:15:37.817014 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.816923 2569 generic.go:358] "Generic (PLEG): container finished" podID="dbd359e5-b9cc-4e77-84d4-20b2777699e7" containerID="9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924" exitCode=0 Apr 20 08:15:37.817014 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.816979 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" Apr 20 08:15:37.817014 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.817002 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" event={"ID":"dbd359e5-b9cc-4e77-84d4-20b2777699e7","Type":"ContainerDied","Data":"9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924"} Apr 20 08:15:37.817242 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.817045 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk" event={"ID":"dbd359e5-b9cc-4e77-84d4-20b2777699e7","Type":"ContainerDied","Data":"9b3e64d7c75b07dde5e77e40ee4e24b02d010e58d65670f894f9655deaec75d1"} Apr 20 08:15:37.817242 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.817061 2569 scope.go:117] "RemoveContainer" containerID="9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924" Apr 20 08:15:37.827456 ip-10-0-138-4 
kubenswrapper[2569]: I0420 08:15:37.827434 2569 scope.go:117] "RemoveContainer" containerID="9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924" Apr 20 08:15:37.827732 ip-10-0-138-4 kubenswrapper[2569]: E0420 08:15:37.827708 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924\": container with ID starting with 9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924 not found: ID does not exist" containerID="9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924" Apr 20 08:15:37.827785 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.827741 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924"} err="failed to get container status \"9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924\": rpc error: code = NotFound desc = could not find container \"9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924\": container with ID starting with 9722dcdda0256b05f0cdea8062d876d07fe824a15a8ba6b1456a1c810a077924 not found: ID does not exist" Apr 20 08:15:37.837630 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.837607 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk"] Apr 20 08:15:37.841020 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.841001 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rlzkk"] Apr 20 08:15:37.852719 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.852697 2569 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dbd359e5-b9cc-4e77-84d4-20b2777699e7-extensions-socket-volume\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath 
\"\"" Apr 20 08:15:37.852719 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:37.852717 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29cpk\" (UniqueName: \"kubernetes.io/projected/dbd359e5-b9cc-4e77-84d4-20b2777699e7-kube-api-access-29cpk\") on node \"ip-10-0-138-4.ec2.internal\" DevicePath \"\"" Apr 20 08:15:39.280781 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:15:39.280748 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd359e5-b9cc-4e77-84d4-20b2777699e7" path="/var/lib/kubelet/pods/dbd359e5-b9cc-4e77-84d4-20b2777699e7/volumes" Apr 20 08:16:44.266315 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.266264 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz"] Apr 20 08:16:44.266800 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.266636 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbd359e5-b9cc-4e77-84d4-20b2777699e7" containerName="manager" Apr 20 08:16:44.266800 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.266649 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd359e5-b9cc-4e77-84d4-20b2777699e7" containerName="manager" Apr 20 08:16:44.266800 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.266721 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbd359e5-b9cc-4e77-84d4-20b2777699e7" containerName="manager" Apr 20 08:16:44.269523 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.269506 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" Apr 20 08:16:44.271476 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.271456 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6dggc\"" Apr 20 08:16:44.278742 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.278719 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz"] Apr 20 08:16:44.402574 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.402543 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/83ea1103-0f43-450e-adb8-5bb334fdac0f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cfgmz\" (UID: \"83ea1103-0f43-450e-adb8-5bb334fdac0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" Apr 20 08:16:44.402733 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.402591 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd56g\" (UniqueName: \"kubernetes.io/projected/83ea1103-0f43-450e-adb8-5bb334fdac0f-kube-api-access-sd56g\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cfgmz\" (UID: \"83ea1103-0f43-450e-adb8-5bb334fdac0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" Apr 20 08:16:44.503577 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.503546 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/83ea1103-0f43-450e-adb8-5bb334fdac0f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cfgmz\" (UID: \"83ea1103-0f43-450e-adb8-5bb334fdac0f\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" Apr 20 08:16:44.503714 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.503596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd56g\" (UniqueName: \"kubernetes.io/projected/83ea1103-0f43-450e-adb8-5bb334fdac0f-kube-api-access-sd56g\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cfgmz\" (UID: \"83ea1103-0f43-450e-adb8-5bb334fdac0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" Apr 20 08:16:44.503920 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.503899 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/83ea1103-0f43-450e-adb8-5bb334fdac0f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cfgmz\" (UID: \"83ea1103-0f43-450e-adb8-5bb334fdac0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" Apr 20 08:16:44.516394 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.516338 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd56g\" (UniqueName: \"kubernetes.io/projected/83ea1103-0f43-450e-adb8-5bb334fdac0f-kube-api-access-sd56g\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cfgmz\" (UID: \"83ea1103-0f43-450e-adb8-5bb334fdac0f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" Apr 20 08:16:44.580868 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.580835 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" Apr 20 08:16:44.707946 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.707920 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz"] Apr 20 08:16:44.709458 ip-10-0-138-4 kubenswrapper[2569]: W0420 08:16:44.709428 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83ea1103_0f43_450e_adb8_5bb334fdac0f.slice/crio-864fd357725ffc7eab610bce0d0e421a746b76e43cd4fa80ac0684f257a8cfb2 WatchSource:0}: Error finding container 864fd357725ffc7eab610bce0d0e421a746b76e43cd4fa80ac0684f257a8cfb2: Status 404 returned error can't find the container with id 864fd357725ffc7eab610bce0d0e421a746b76e43cd4fa80ac0684f257a8cfb2 Apr 20 08:16:44.712276 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:44.712259 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 08:16:45.069123 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:45.069043 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" event={"ID":"83ea1103-0f43-450e-adb8-5bb334fdac0f","Type":"ContainerStarted","Data":"eb570ced3ca7b1b93df386e106fdc328dd9704a21d27b5b53ae16c15f741cdf7"} Apr 20 08:16:45.069123 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:45.069082 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" event={"ID":"83ea1103-0f43-450e-adb8-5bb334fdac0f","Type":"ContainerStarted","Data":"864fd357725ffc7eab610bce0d0e421a746b76e43cd4fa80ac0684f257a8cfb2"} Apr 20 08:16:45.069123 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:45.069095 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" Apr 20 08:16:45.085611 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:45.085556 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" podStartSLOduration=1.08553862 podStartE2EDuration="1.08553862s" podCreationTimestamp="2026-04-20 08:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 08:16:45.084141006 +0000 UTC m=+1598.380713989" watchObservedRunningTime="2026-04-20 08:16:45.08553862 +0000 UTC m=+1598.382111606" Apr 20 08:16:56.075023 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:16:56.074991 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cfgmz" Apr 20 08:20:07.349144 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:20:07.349114 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:20:07.355071 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:20:07.355047 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:25:07.380845 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:25:07.380818 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:25:07.387157 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:25:07.387138 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log" Apr 20 08:27:04.261975 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:04.261939 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-gnct7_efe4b4d1-5b5f-4188-8c66-d364d4c15d89/manager/0.log" Apr 20 08:27:04.383713 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:04.383680 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7ccccb6d4f-lbgk6_39c03c8b-327c-4bd1-86ef-b8a5b1039200/maas-api/0.log" Apr 20 08:27:04.653079 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:04.653002 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-8zsdw_caddfbde-b868-43f4-bba6-1b6166d52d42/manager/1.log" Apr 20 08:27:05.041035 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:05.041003 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-687c889b9-q6blb_347d9aac-7ac6-44f2-b44e-dd9b37d26eab/manager/0.log" Apr 20 08:27:06.002717 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.002687 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t_dfb1b096-0c6b-4634-a22b-de3317e5ca49/util/0.log" Apr 20 08:27:06.009446 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.009411 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t_dfb1b096-0c6b-4634-a22b-de3317e5ca49/pull/0.log" Apr 20 08:27:06.015892 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.015869 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t_dfb1b096-0c6b-4634-a22b-de3317e5ca49/extract/0.log" Apr 20 08:27:06.128510 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.128475 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc_24a008ba-c82d-48df-9a39-aa2a17fe63f4/util/0.log" Apr 20 08:27:06.135452 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.135425 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc_24a008ba-c82d-48df-9a39-aa2a17fe63f4/pull/0.log" Apr 20 08:27:06.142046 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.142028 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc_24a008ba-c82d-48df-9a39-aa2a17fe63f4/extract/0.log" Apr 20 08:27:06.266175 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.266140 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm_a0f8ffbf-0872-4075-b182-a5ad51501cfd/util/0.log" Apr 20 08:27:06.272562 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.272540 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm_a0f8ffbf-0872-4075-b182-a5ad51501cfd/pull/0.log" Apr 20 08:27:06.278725 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.278705 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm_a0f8ffbf-0872-4075-b182-a5ad51501cfd/extract/0.log" Apr 20 08:27:06.396089 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.396061 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz_e03b4b3e-46c1-4df2-ad89-b51387e90d5f/util/0.log" Apr 20 08:27:06.403279 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.403249 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz_e03b4b3e-46c1-4df2-ad89-b51387e90d5f/pull/0.log" Apr 20 08:27:06.410124 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.410100 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz_e03b4b3e-46c1-4df2-ad89-b51387e90d5f/extract/0.log" Apr 20 08:27:06.666531 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:06.666441 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-zp5mg_21ee5ad3-fc71-4930-9b45-2317ed0b800d/manager/0.log" Apr 20 08:27:07.174050 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:07.174021 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-cfgmz_83ea1103-0f43-450e-adb8-5bb334fdac0f/manager/0.log" Apr 20 08:27:07.930842 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:07.930809 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-xprjb_24c37dbe-4cd8-4313-adfd-4d04f39bd0d6/discovery/0.log" Apr 20 08:27:08.055289 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:08.055261 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b9784c649-4nbqg_fb2fe040-660d-4160-92fd-18d18497d727/kube-auth-proxy/0.log" Apr 20 08:27:08.811243 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:08.811193 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-28bws_df696bed-9e0c-4a63-95b5-1bd7009a461b/storage-initializer/0.log" Apr 20 08:27:08.818106 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:08.818084 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-28bws_df696bed-9e0c-4a63-95b5-1bd7009a461b/main/0.log" Apr 20 08:27:09.347160 ip-10-0-138-4 
kubenswrapper[2569]: I0420 08:27:09.347132 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr_5ff0b80a-739c-4612-95ad-2419940077dc/main/0.log" Apr 20 08:27:09.358482 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:09.358459 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-bgdlr_5ff0b80a-739c-4612-95ad-2419940077dc/storage-initializer/0.log" Apr 20 08:27:09.476528 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:09.476501 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-tf926_14039460-1624-4f75-8d8f-59076df141e5/storage-initializer/0.log" Apr 20 08:27:09.488953 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:09.488921 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-tf926_14039460-1624-4f75-8d8f-59076df141e5/main/0.log" Apr 20 08:27:16.909085 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:16.909052 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-r4n6p_d06eb14c-741b-46bf-aada-fd390434ddfd/global-pull-secret-syncer/0.log" Apr 20 08:27:17.003466 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:17.003435 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-s2r4p_75f0e0ec-e11d-4c52-b6bd-ec1da8086f15/konnectivity-agent/0.log" Apr 20 08:27:17.066955 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:17.066923 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-4.ec2.internal_53a487abe77507ab89cb5cf1017a5b1f/haproxy/0.log" Apr 20 08:27:20.239168 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.239090 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t_dfb1b096-0c6b-4634-a22b-de3317e5ca49/extract/0.log" Apr 20 08:27:20.257816 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.257790 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t_dfb1b096-0c6b-4634-a22b-de3317e5ca49/util/0.log" Apr 20 08:27:20.281515 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.281483 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759txp6t_dfb1b096-0c6b-4634-a22b-de3317e5ca49/pull/0.log" Apr 20 08:27:20.325001 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.324977 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc_24a008ba-c82d-48df-9a39-aa2a17fe63f4/extract/0.log" Apr 20 08:27:20.348969 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.348944 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc_24a008ba-c82d-48df-9a39-aa2a17fe63f4/util/0.log" Apr 20 08:27:20.371484 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.371461 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qsnvc_24a008ba-c82d-48df-9a39-aa2a17fe63f4/pull/0.log" Apr 20 08:27:20.402184 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.402156 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm_a0f8ffbf-0872-4075-b182-a5ad51501cfd/extract/0.log" Apr 20 08:27:20.429643 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.429621 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm_a0f8ffbf-0872-4075-b182-a5ad51501cfd/util/0.log"
Apr 20 08:27:20.455433 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.455410 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73224zm_a0f8ffbf-0872-4075-b182-a5ad51501cfd/pull/0.log"
Apr 20 08:27:20.482304 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.482283 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz_e03b4b3e-46c1-4df2-ad89-b51387e90d5f/extract/0.log"
Apr 20 08:27:20.523033 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.523001 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz_e03b4b3e-46c1-4df2-ad89-b51387e90d5f/util/0.log"
Apr 20 08:27:20.555838 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.555819 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1bmwpz_e03b4b3e-46c1-4df2-ad89-b51387e90d5f/pull/0.log"
Apr 20 08:27:20.831967 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:20.831894 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-zp5mg_21ee5ad3-fc71-4930-9b45-2317ed0b800d/manager/0.log"
Apr 20 08:27:21.003653 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:21.003622 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-cfgmz_83ea1103-0f43-450e-adb8-5bb334fdac0f/manager/0.log"
Apr 20 08:27:22.585063 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:22.585034 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kwntf_7dacd9e0-2a6d-4b57-8ff9-7f04940233b2/kube-state-metrics/0.log"
Apr 20 08:27:22.601768 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:22.601737 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kwntf_7dacd9e0-2a6d-4b57-8ff9-7f04940233b2/kube-rbac-proxy-main/0.log"
Apr 20 08:27:22.622359 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:22.622336 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kwntf_7dacd9e0-2a6d-4b57-8ff9-7f04940233b2/kube-rbac-proxy-self/0.log"
Apr 20 08:27:22.646607 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:22.646586 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5bc96c64f4-dxtfp_74f750d2-e3e6-453d-ba47-5f700e14b402/metrics-server/0.log"
Apr 20 08:27:22.857270 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:22.857175 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qjwjd_d1a29c4b-4ce9-483b-9569-26d9b1d19d6d/node-exporter/0.log"
Apr 20 08:27:22.879579 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:22.879555 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qjwjd_d1a29c4b-4ce9-483b-9569-26d9b1d19d6d/kube-rbac-proxy/0.log"
Apr 20 08:27:22.898925 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:22.898903 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qjwjd_d1a29c4b-4ce9-483b-9569-26d9b1d19d6d/init-textfile/0.log"
Apr 20 08:27:22.938416 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:22.938396 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-r9288_590627fc-8f17-4708-806d-6d1aaa587b47/kube-rbac-proxy-main/0.log"
Apr 20 08:27:22.964568 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:22.964543 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-r9288_590627fc-8f17-4708-806d-6d1aaa587b47/kube-rbac-proxy-self/0.log"
Apr 20 08:27:22.988566 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:22.988535 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-r9288_590627fc-8f17-4708-806d-6d1aaa587b47/openshift-state-metrics/0.log"
Apr 20 08:27:25.989002 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:25.988971 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"]
Apr 20 08:27:25.992584 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:25.992569 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:25.994542 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:25.994518 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mvtfk\"/\"kube-root-ca.crt\""
Apr 20 08:27:25.994910 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:25.994896 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mvtfk\"/\"default-dockercfg-qxwb7\""
Apr 20 08:27:25.995182 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:25.995169 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mvtfk\"/\"openshift-service-ca.crt\""
Apr 20 08:27:26.000474 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.000446 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"]
Apr 20 08:27:26.136709 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.136657 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-proc\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.136709 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.136706 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2s4l\" (UniqueName: \"kubernetes.io/projected/78541c56-0356-40dc-bef5-39bdf3866d02-kube-api-access-s2s4l\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.136999 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.136730 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-lib-modules\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.136999 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.136750 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-sys\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.136999 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.136836 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-podres\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.237780 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.237741 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-podres\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.237945 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.237814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-proc\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.237945 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.237834 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2s4l\" (UniqueName: \"kubernetes.io/projected/78541c56-0356-40dc-bef5-39bdf3866d02-kube-api-access-s2s4l\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.237945 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.237856 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-lib-modules\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.237945 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.237873 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-sys\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.237945 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.237913 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-podres\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.237945 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.237936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-proc\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.237945 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.237943 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-sys\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.238247 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.237999 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/78541c56-0356-40dc-bef5-39bdf3866d02-lib-modules\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.245510 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.245453 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2s4l\" (UniqueName: \"kubernetes.io/projected/78541c56-0356-40dc-bef5-39bdf3866d02-kube-api-access-s2s4l\") pod \"perf-node-gather-daemonset-l4275\" (UID: \"78541c56-0356-40dc-bef5-39bdf3866d02\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.303612 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.303586 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:26.425920 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.425891 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"]
Apr 20 08:27:26.427464 ip-10-0-138-4 kubenswrapper[2569]: W0420 08:27:26.427435 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod78541c56_0356_40dc_bef5_39bdf3866d02.slice/crio-5a32c85fb4114f96f42814986fd89f6c516b479fd72d7687eed255d86337f92c WatchSource:0}: Error finding container 5a32c85fb4114f96f42814986fd89f6c516b479fd72d7687eed255d86337f92c: Status 404 returned error can't find the container with id 5a32c85fb4114f96f42814986fd89f6c516b479fd72d7687eed255d86337f92c
Apr 20 08:27:26.428931 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.428913 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 08:27:26.501122 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:26.501069 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275" event={"ID":"78541c56-0356-40dc-bef5-39bdf3866d02","Type":"ContainerStarted","Data":"5a32c85fb4114f96f42814986fd89f6c516b479fd72d7687eed255d86337f92c"}
Apr 20 08:27:27.007477 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:27.007447 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6pmzx_19f51839-0090-41a8-b3ef-00a1ee0ca874/dns/0.log"
Apr 20 08:27:27.027525 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:27.027498 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6pmzx_19f51839-0090-41a8-b3ef-00a1ee0ca874/kube-rbac-proxy/0.log"
Apr 20 08:27:27.164579 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:27.164548 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jd8vj_8c2b7af6-350a-4223-8075-1a8760e67c96/dns-node-resolver/0.log"
Apr 20 08:27:27.506536 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:27.506502 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275" event={"ID":"78541c56-0356-40dc-bef5-39bdf3866d02","Type":"ContainerStarted","Data":"118e5b2adb130a3a5efe78c3616aae268ccb01792ecab9a7595a3a27ad936713"}
Apr 20 08:27:27.506709 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:27.506648 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:27.522056 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:27.522015 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275" podStartSLOduration=2.522002643 podStartE2EDuration="2.522002643s" podCreationTimestamp="2026-04-20 08:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 08:27:27.520247826 +0000 UTC m=+2240.816820807" watchObservedRunningTime="2026-04-20 08:27:27.522002643 +0000 UTC m=+2240.818575653"
Apr 20 08:27:27.700506 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:27.700474 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-khs8v_f5d50f48-c1cd-490d-8f78-48a66378ab3a/node-ca/0.log"
Apr 20 08:27:28.548049 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:28.548020 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-xprjb_24c37dbe-4cd8-4313-adfd-4d04f39bd0d6/discovery/0.log"
Apr 20 08:27:28.566668 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:28.566641 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b9784c649-4nbqg_fb2fe040-660d-4160-92fd-18d18497d727/kube-auth-proxy/0.log"
Apr 20 08:27:29.141771 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:29.141745 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gg5hx_5ba51e49-9c17-47c6-813d-05581eece4d6/serve-healthcheck-canary/0.log"
Apr 20 08:27:29.712935 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:29.712909 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-znzxf_deea7fc6-8ab1-4fa5-bc3e-7464e89e4318/kube-rbac-proxy/0.log"
Apr 20 08:27:29.731709 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:29.731681 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-znzxf_deea7fc6-8ab1-4fa5-bc3e-7464e89e4318/exporter/0.log"
Apr 20 08:27:29.749539 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:29.749514 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-znzxf_deea7fc6-8ab1-4fa5-bc3e-7464e89e4318/extractor/0.log"
Apr 20 08:27:31.671353 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:31.671325 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-gnct7_efe4b4d1-5b5f-4188-8c66-d364d4c15d89/manager/0.log"
Apr 20 08:27:31.700187 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:31.700159 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7ccccb6d4f-lbgk6_39c03c8b-327c-4bd1-86ef-b8a5b1039200/maas-api/0.log"
Apr 20 08:27:31.774223 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:31.774175 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-8zsdw_caddfbde-b868-43f4-bba6-1b6166d52d42/manager/0.log"
Apr 20 08:27:31.785190 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:31.785167 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-8zsdw_caddfbde-b868-43f4-bba6-1b6166d52d42/manager/1.log"
Apr 20 08:27:31.874631 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:31.874595 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-687c889b9-q6blb_347d9aac-7ac6-44f2-b44e-dd9b37d26eab/manager/0.log"
Apr 20 08:27:33.519196 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:33.519164 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-l4275"
Apr 20 08:27:38.723773 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:38.723742 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gs6zc_ea6db407-9937-4b0f-84e4-91f5c10786a5/kube-multus-additional-cni-plugins/0.log"
Apr 20 08:27:38.752270 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:38.752232 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gs6zc_ea6db407-9937-4b0f-84e4-91f5c10786a5/egress-router-binary-copy/0.log"
Apr 20 08:27:38.773537 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:38.773516 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gs6zc_ea6db407-9937-4b0f-84e4-91f5c10786a5/cni-plugins/0.log"
Apr 20 08:27:38.791750 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:38.791723 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gs6zc_ea6db407-9937-4b0f-84e4-91f5c10786a5/bond-cni-plugin/0.log"
Apr 20 08:27:38.809178 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:38.809160 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gs6zc_ea6db407-9937-4b0f-84e4-91f5c10786a5/routeoverride-cni/0.log"
Apr 20 08:27:38.827168 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:38.827140 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gs6zc_ea6db407-9937-4b0f-84e4-91f5c10786a5/whereabouts-cni-bincopy/0.log"
Apr 20 08:27:38.845061 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:38.845043 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gs6zc_ea6db407-9937-4b0f-84e4-91f5c10786a5/whereabouts-cni/0.log"
Apr 20 08:27:39.137299 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:39.137266 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qh227_a87d5cb5-84a7-4b46-9c01-785c30aedcbf/kube-multus/0.log"
Apr 20 08:27:39.257607 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:39.257572 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-m5qfv_0e96090b-285a-4c1b-98c7-6793626b3969/network-metrics-daemon/0.log"
Apr 20 08:27:39.275869 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:39.275836 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-m5qfv_0e96090b-285a-4c1b-98c7-6793626b3969/kube-rbac-proxy/0.log"
Apr 20 08:27:40.581168 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:40.581138 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-controller/0.log"
Apr 20 08:27:40.599386 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:40.599361 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/0.log"
Apr 20 08:27:40.609593 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:40.609568 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovn-acl-logging/1.log"
Apr 20 08:27:40.628067 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:40.628047 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/kube-rbac-proxy-node/0.log"
Apr 20 08:27:40.648282 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:40.648264 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 08:27:40.665256 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:40.665198 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/northd/0.log"
Apr 20 08:27:40.682840 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:40.682814 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/nbdb/0.log"
Apr 20 08:27:40.724642 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:40.724619 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/sbdb/0.log"
Apr 20 08:27:40.821427 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:40.821385 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqpbd_872f16e1-a280-4e38-b34a-f24ffef351d3/ovnkube-controller/0.log"
Apr 20 08:27:41.895871 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:41.895842 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vmv56_0a4dfd92-0a59-4e3f-bb86-c3a74ffec631/network-check-target-container/0.log"
Apr 20 08:27:42.951427 ip-10-0-138-4 kubenswrapper[2569]: I0420 08:27:42.951399 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vmv56_0a4dfd92-0a59-4e3f-bb86-c3a74ffec631/iptables-alerter/0.log"