Apr 21 01:47:38.888802 ip-10-0-141-35 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 01:47:38.888814 ip-10-0-141-35 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 01:47:38.888821 ip-10-0-141-35 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 01:47:38.889031 ip-10-0-141-35 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 01:47:48.984021 ip-10-0-141-35 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 01:47:48.984036 ip-10-0-141-35 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f8f5f6f533664910a50e4e225d7115a5 --
Apr 21 01:50:20.111650 ip-10-0-141-35 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 01:50:20.534485 ip-10-0-141-35 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 01:50:20.534485 ip-10-0-141-35 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 01:50:20.534485 ip-10-0-141-35 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 01:50:20.534485 ip-10-0-141-35 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 01:50:20.534485 ip-10-0-141-35 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 01:50:20.537051 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.536954 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 01:50:20.539267 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539245 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 01:50:20.539267 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539262 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 01:50:20.539267 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539267 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 01:50:20.539267 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539272 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 01:50:20.539267 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539276 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539289 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539293 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539297 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539318 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539322 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539326 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539330 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539334 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539338 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539342 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539346 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539350 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539353 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539357 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539361 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539365 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539368 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539372 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539377 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 01:50:20.539577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539381 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539385 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539388 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539392 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539396 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539400 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539403 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539409 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539413 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539417 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539421 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539426 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539430 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539435 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539440 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539444 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539449 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539454 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539458 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539469 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 01:50:20.540462 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539474 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539478 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539482 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539487 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539491 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539496 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539501 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539505 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539509 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539513 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539517 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539521 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539526 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539530 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539542 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539548 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539553 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539558 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539562 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 01:50:20.541144 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539567 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539572 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539576 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539583 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539589 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539596 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539603 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539609 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539614 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539618 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539623 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539628 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539633 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539637 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539641 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539645 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539648 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539652 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 01:50:20.541654 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539657 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539661 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539666 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539670 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.539674 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540320 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540329 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540333 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540338 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540341 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540346 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540350 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540354 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540360 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540364 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540369 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540373 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540377 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540381 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540386 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540390 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:50:20.542229 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540395 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540399 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540403 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540407 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540412 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540417 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540421 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540425 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540433 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540439 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540444 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540449 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540454 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540458 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540462 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540466 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540471 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540475 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540479 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 01:50:20.543051 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540483 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540489 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540494 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540498 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540503 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540508 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540512 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540516 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540522 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540526 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540530 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540534 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540538 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540543 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540547 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540552 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540556 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540560 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540564 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 01:50:20.543820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540568 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540572 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540575 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540580 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540584 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540589 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540593 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540597 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540601 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540606 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540610 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540614 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540617 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540621 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540625 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540629 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540633 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540638 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540642 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540647 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 01:50:20.544410 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540651 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540657 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540663 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540668 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540672 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540677 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540681 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540685 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540689 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540694 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540697 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.540701 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.541985 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542002 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542011 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542018 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542025 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542031 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542039 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542046 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542052 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 01:50:20.544910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542057 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542062 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542067 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542072 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542078 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542083 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542088 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542093 2568 flags.go:64] FLAG: --cloud-config=""
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542097 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542103 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542110 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542115 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542120 2568 flags.go:64] FLAG: --config-dir=""
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542125 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542132 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542139 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542144 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542149 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542154 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542159 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542164 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542169 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542175 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542180 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542187 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 01:50:20.545452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542192 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542196 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542201 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542206 2568 flags.go:64] FLAG: --enable-server="true"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542211 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542217 2568 flags.go:64] FLAG: --event-burst="100"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542222 2568 flags.go:64] FLAG: --event-qps="50"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542227 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542233 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542238 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542244 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542249 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542254 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542259 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542264 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542269 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542274 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542279 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542284 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542289 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542295 2568 flags.go:64] FLAG: --feature-gates=""
Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542317 2568 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542324 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542329 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542334 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542340 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 21 01:50:20.546059 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542345 2568 flags.go:64] FLAG: --help="false" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542350 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-141-35.ec2.internal" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542355 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542360 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542365 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542371 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542377 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542382 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542387 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 
01:50:20.542391 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542396 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542401 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542407 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542412 2568 flags.go:64] FLAG: --kube-reserved="" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542417 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542421 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542427 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542431 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542436 2568 flags.go:64] FLAG: --lock-file="" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542441 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542446 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542451 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542460 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 01:50:20.546748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542465 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 01:50:20.547328 
ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542470 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542475 2568 flags.go:64] FLAG: --logging-format="text" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542480 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542485 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542490 2568 flags.go:64] FLAG: --manifest-url="" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542496 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542503 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542508 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542515 2568 flags.go:64] FLAG: --max-pods="110" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542520 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542525 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542529 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542534 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542540 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542545 2568 
flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542549 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542562 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542567 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542572 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542578 2568 flags.go:64] FLAG: --pod-cidr="" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542582 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542591 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542596 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 01:50:20.547328 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542601 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542606 2568 flags.go:64] FLAG: --port="10250" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542611 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542616 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a3477b3266b01165" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542626 2568 flags.go:64] FLAG: --qos-reserved="" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542631 
2568 flags.go:64] FLAG: --read-only-port="10255" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542636 2568 flags.go:64] FLAG: --register-node="true" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542641 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542646 2568 flags.go:64] FLAG: --register-with-taints="" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542652 2568 flags.go:64] FLAG: --registry-burst="10" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542658 2568 flags.go:64] FLAG: --registry-qps="5" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542663 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542668 2568 flags.go:64] FLAG: --reserved-memory="" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542674 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542679 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542685 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542690 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542696 2568 flags.go:64] FLAG: --runonce="false" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542700 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542705 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542711 2568 
flags.go:64] FLAG: --seccomp-default="false" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542715 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542720 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542725 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542730 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542735 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 01:50:20.547949 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542740 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542744 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542749 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542754 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542758 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542764 2568 flags.go:64] FLAG: --system-cgroups="" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542769 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542778 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542783 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 21 
01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542787 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542796 2568 flags.go:64] FLAG: --tls-min-version="" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542801 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542805 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542810 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542815 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542820 2568 flags.go:64] FLAG: --v="2" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542829 2568 flags.go:64] FLAG: --version="false" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542835 2568 flags.go:64] FLAG: --vmodule="" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542843 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.542848 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.543991 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544008 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544015 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 01:50:20.548609 ip-10-0-141-35 
kubenswrapper[2568]: W0421 01:50:20.544021 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 01:50:20.548609 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544026 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544031 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544114 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544147 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544245 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544249 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544253 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544256 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544259 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544263 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544266 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544269 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 01:50:20.549202 ip-10-0-141-35 
kubenswrapper[2568]: W0421 01:50:20.544272 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544275 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544280 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544285 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544288 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544291 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544293 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544297 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 01:50:20.549202 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544300 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544303 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544321 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544324 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544327 2568 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544330 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544333 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544335 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544338 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544341 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544343 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544346 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544349 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544353 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544356 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544359 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544362 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 01:50:20.549739 
ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544365 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544368 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544370 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 01:50:20.549739 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544373 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544376 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544379 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544381 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544384 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544386 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544389 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544392 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544394 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544397 2568 feature_gate.go:328] 
unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544399 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544402 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544405 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544408 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544411 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544413 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544416 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544419 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544422 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 01:50:20.550264 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544425 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544428 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544432 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544436 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544439 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544441 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544444 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544446 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544450 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544453 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544456 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544459 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544461 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544464 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544467 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544470 2568 
feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544472 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544475 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544477 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544480 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 01:50:20.550790 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544482 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 01:50:20.551290 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544485 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 01:50:20.551290 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.544487 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 01:50:20.551290 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.545227 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 01:50:20.553082 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.553060 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 01:50:20.553122 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.553084 2568 server.go:532] "Golang settings" 
GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 01:50:20.553154 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553133 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 01:50:20.553154 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553138 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 01:50:20.553154 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553142 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 01:50:20.553154 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553145 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 01:50:20.553154 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553148 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 01:50:20.553154 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553151 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 01:50:20.553154 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553154 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 01:50:20.553154 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553158 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553160 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553163 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553166 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553170 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553175 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553178 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553181 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553183 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553186 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553189 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553191 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553194 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553197 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553200 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553202 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553205 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553209 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553211 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 01:50:20.553366 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553214 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553216 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553219 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553221 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553224 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553227 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553229 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553232 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553235 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553237 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553240 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553242 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553245 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553248 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553250 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553254 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553257 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553259 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553262 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553264 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 01:50:20.553841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553267 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553270 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553272 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553275 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553278 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553280 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553283 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553286 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553288 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553292 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553294 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553297 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553300 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553302 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553322 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553325 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553328 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553330 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553333 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553336 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 01:50:20.554465 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553339 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553341 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553344 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553346 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553349 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553352 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553355 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553358 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553362 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553366 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553369 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553372 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553376 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553379 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553382 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553386 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553388 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553391 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553394 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 01:50:20.554956 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553396 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.553401 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553499 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553504 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553507 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553509 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553512 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553514 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553517 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553520 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553522 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553525 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553528 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553530 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553533 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553536 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 01:50:20.555435 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553538 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553541 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553544 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553547 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553551 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553553 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553556 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553559 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553562 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553565 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553568 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553570 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553573 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553576 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553579 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553582 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553584 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553588 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553592 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 01:50:20.555841 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553595 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553597 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553600 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553602 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553605 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553607 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553610 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553613 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553616 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553618 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553621 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553623 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553626 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553629 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553631 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553634 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553636 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553639 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553642 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553645 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:50:20.556329 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553647 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553650 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553653 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553655 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553658 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553660 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553663 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553666 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553668 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553671 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553673 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553676 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553679 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553681 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553683 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553686 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553689 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553693 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 01:50:20.556835 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553697 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553700 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553703 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553705 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553708 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553711 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553714 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553717 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553720 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553723 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553725 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553728 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553730 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553733 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:20.553735 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.553740 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 01:50:20.557315 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.554463 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 01:50:20.557730 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.557234 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 01:50:20.558056 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.558044 2568 server.go:1019] "Starting client certificate rotation"
Apr 21 01:50:20.558172 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.558153 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 01:50:20.558227 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.558215 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 01:50:20.581571 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.581546 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 01:50:20.585564 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.585540 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 01:50:20.600162 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.600143 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 21 01:50:20.605460 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.605439 2568 log.go:25] "Validated CRI v1 image API"
Apr 21 01:50:20.607347 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.607332 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 01:50:20.608990 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.608971 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 01:50:20.610126 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.610105 2568 fs.go:135] Filesystem UUIDs: map[5a3e53db-0ed4-46ff-9332-e69f9bf763c6:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 7d0981bf-75fe-451a-8d32-a691ec5b04c7:/dev/nvme0n1p3]
Apr 21 01:50:20.610197 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.610125 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 01:50:20.616461 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.616352 2568 manager.go:217] Machine: {Timestamp:2026-04-21 01:50:20.614610045 +0000 UTC m=+0.384572211 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101255 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec282f6a666b2f57a9a09c0c55412f5e SystemUUID:ec282f6a-666b-2f57-a9a0-9c0c55412f5e BootID:f8f5f6f5-3366-4910-a50e-4e225d7115a5 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:69:7a:74:fe:6f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:69:7a:74:fe:6f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:97:21:fd:9d:b0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 01:50:20.616461 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.616456 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 01:50:20.616570 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.616539 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 01:50:20.618183 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.618158 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 01:50:20.618353 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.618186 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-35.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 01:50:20.618397 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.618361 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 01:50:20.618397 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.618369 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 01:50:20.618397 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.618382 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 01:50:20.619269 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.619258 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 01:50:20.620295 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.620285 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 01:50:20.620413 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.620404 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 01:50:20.622606 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.622597 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 01:50:20.622638 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.622615 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 01:50:20.622638 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.622628 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 01:50:20.622638 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.622638 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 21 01:50:20.622722 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.622647 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 01:50:20.623580 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.623567 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 01:50:20.623625 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.623586 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 01:50:20.626369 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.626353 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 01:50:20.627602 ip-10-0-141-35
kubenswrapper[2568]: I0421 01:50:20.627589 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 01:50:20.629468 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629453 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 01:50:20.629510 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629495 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 01:50:20.629510 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629506 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 01:50:20.629572 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629514 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 01:50:20.629572 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629522 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 01:50:20.629572 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629527 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 01:50:20.629572 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629534 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 01:50:20.629572 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629539 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 01:50:20.629572 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629547 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 01:50:20.629572 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629553 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 01:50:20.629572 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629571 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 01:50:20.629774 
ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.629580 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 01:50:20.630470 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.630452 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2tgfn" Apr 21 01:50:20.630523 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.630477 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 01:50:20.630523 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.630485 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 01:50:20.634274 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.634258 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 01:50:20.634367 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.634296 2568 server.go:1295] "Started kubelet" Apr 21 01:50:20.634441 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.634394 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 01:50:20.634511 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.634462 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 01:50:20.634559 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.634524 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 01:50:20.635277 ip-10-0-141-35 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 01:50:20.638279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.638253 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 01:50:20.638513 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.638495 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2tgfn"
Apr 21 01:50:20.638582 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.638563 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-35.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 01:50:20.638787 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.638766 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 01:50:20.638915 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.638887 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-35.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 01:50:20.640070 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.640052 2568 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 01:50:20.645185 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.645163 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 01:50:20.645291 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.645177 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 01:50:20.645774 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.645750 2568 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 01:50:20.645774 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.645775 2568 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 01:50:20.645904 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.645855 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 01:50:20.645904 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.645898 2568 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 01:50:20.645904 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.645905 2568 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 01:50:20.646064 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.646030 2568 factory.go:55] Registering systemd factory
Apr 21 01:50:20.646125 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.646089 2568 factory.go:223] Registration of the systemd container factory successfully
Apr 21 01:50:20.646239 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.646218 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:20.646346 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.646330 2568 factory.go:153] Registering CRI-O factory
Apr 21 01:50:20.646434 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.646348 2568 factory.go:223] Registration of the crio container factory successfully
Apr 21 01:50:20.646434 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.646400 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 01:50:20.646434 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.646422 2568 factory.go:103] Registering Raw factory
Apr 21 01:50:20.646559 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.646437 2568 manager.go:1196] Started watching for new ooms in manager
Apr 21 01:50:20.646869 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.646852 2568 manager.go:319] Starting recovery of all containers
Apr 21 01:50:20.648508 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.648488 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:20.649947 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.649925 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 01:50:20.650571 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.650552 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-35.ec2.internal\" not found" node="ip-10-0-141-35.ec2.internal"
Apr 21 01:50:20.657230 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.657078 2568 manager.go:324] Recovery completed
Apr 21 01:50:20.661512 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.661500 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 01:50:20.664278 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.664262 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 01:50:20.664414 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.664297 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 01:50:20.664414 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.664327 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientPID"
Apr 21 01:50:20.664850 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.664833 2568 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 01:50:20.664850 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.664845 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 01:50:20.664946 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.664874 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 01:50:20.666965 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.666954 2568 policy_none.go:49] "None policy: Start"
Apr 21 01:50:20.667012 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.666978 2568 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 01:50:20.667012 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.666988 2568 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 01:50:20.708387 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.703975 2568 manager.go:341] "Starting Device Plugin manager"
Apr 21 01:50:20.708387 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.704019 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 01:50:20.708387 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.704030 2568 server.go:85] "Starting device plugin registration server"
Apr 21 01:50:20.708387 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.704280 2568 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 01:50:20.708387 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.704294 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 01:50:20.708387 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.704432 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 01:50:20.708387 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.704537 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 01:50:20.708387 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.704546 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 01:50:20.708387 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.705049 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 01:50:20.708387 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.705079 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:20.770873 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.770841 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 01:50:20.772032 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.772015 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 01:50:20.772144 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.772047 2568 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 01:50:20.772144 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.772071 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 01:50:20.772144 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.772078 2568 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 01:50:20.772144 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.772109 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 01:50:20.774497 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.774468 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:20.805482 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.805429 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 01:50:20.806509 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.806494 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 01:50:20.806588 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.806525 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 01:50:20.806588 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.806538 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientPID"
Apr 21 01:50:20.806588 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.806561 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-35.ec2.internal"
Apr 21 01:50:20.817563 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.817550 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-35.ec2.internal"
Apr 21 01:50:20.817602 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.817571 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-35.ec2.internal\": node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:20.832095 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.832076 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:20.872786 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.872760 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal"]
Apr 21 01:50:20.872846 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.872842 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 01:50:20.873747 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.873730 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 01:50:20.873820 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.873765 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 01:50:20.873820 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.873778 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientPID"
Apr 21 01:50:20.875107 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.875093 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 01:50:20.875265 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.875253 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:20.875303 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.875279 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 01:50:20.875828 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.875812 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 01:50:20.875922 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.875830 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 01:50:20.875922 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.875842 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 01:50:20.875922 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.875853 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 01:50:20.875922 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.875858 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientPID"
Apr 21 01:50:20.875922 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.875864 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientPID"
Apr 21 01:50:20.876993 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.876981 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:20.877081 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.877006 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 01:50:20.877673 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.877660 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 01:50:20.877734 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.877681 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 01:50:20.877734 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.877695 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeHasSufficientPID"
Apr 21 01:50:20.891880 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.891861 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-35.ec2.internal\" not found" node="ip-10-0-141-35.ec2.internal"
Apr 21 01:50:20.896133 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.896116 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-35.ec2.internal\" not found" node="ip-10-0-141-35.ec2.internal"
Apr 21 01:50:20.932960 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:20.932931 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:20.947762 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.947740 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/26365148514fa10173edd0155a08a3fb-config\") pod \"kube-apiserver-proxy-ip-10-0-141-35.ec2.internal\" (UID: \"26365148514fa10173edd0155a08a3fb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:20.947846 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.947766 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7d7b413176f7ee78f446a0aea3fee038-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal\" (UID: \"7d7b413176f7ee78f446a0aea3fee038\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:20.947846 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:20.947783 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d7b413176f7ee78f446a0aea3fee038-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal\" (UID: \"7d7b413176f7ee78f446a0aea3fee038\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.034084 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:21.034046 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:21.048405 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.048380 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/26365148514fa10173edd0155a08a3fb-config\") pod \"kube-apiserver-proxy-ip-10-0-141-35.ec2.internal\" (UID: \"26365148514fa10173edd0155a08a3fb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.048476 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.048412 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7d7b413176f7ee78f446a0aea3fee038-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal\" (UID: \"7d7b413176f7ee78f446a0aea3fee038\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.048476 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.048430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d7b413176f7ee78f446a0aea3fee038-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal\" (UID: \"7d7b413176f7ee78f446a0aea3fee038\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.048476 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.048469 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d7b413176f7ee78f446a0aea3fee038-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal\" (UID: \"7d7b413176f7ee78f446a0aea3fee038\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.048579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.048500 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7d7b413176f7ee78f446a0aea3fee038-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal\" (UID: \"7d7b413176f7ee78f446a0aea3fee038\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.048579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.048538 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/26365148514fa10173edd0155a08a3fb-config\") pod \"kube-apiserver-proxy-ip-10-0-141-35.ec2.internal\" (UID: \"26365148514fa10173edd0155a08a3fb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.134784 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:21.134720 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:21.194252 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.194230 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.198803 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.198780 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.235379 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:21.235351 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:21.335876 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:21.335841 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:21.436389 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:21.436323 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:21.536829 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:21.536797 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:21.558265 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.558236 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 01:50:21.558432 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.558414 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 01:50:21.558483 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.558426 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 01:50:21.637756 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:21.637723 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:21.641586 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.641550 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 01:45:20 +0000 UTC" deadline="2027-11-10 19:35:46.625267018 +0000 UTC"
Apr 21 01:50:21.641586 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.641577 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13649h45m24.983692991s"
Apr 21 01:50:21.645436 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.645422 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 01:50:21.659955 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.659935 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 01:50:21.680524 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.680504 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m2khj"
Apr 21 01:50:21.685687 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.685668 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m2khj"
Apr 21 01:50:21.691546 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:21.691497 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d7b413176f7ee78f446a0aea3fee038.slice/crio-9e4b21b850420c385b73e27d243de99ec3e2d913885654e6952202a9fafcf86c WatchSource:0}: Error finding container 9e4b21b850420c385b73e27d243de99ec3e2d913885654e6952202a9fafcf86c: Status 404 returned error can't find the container with id 9e4b21b850420c385b73e27d243de99ec3e2d913885654e6952202a9fafcf86c
Apr 21 01:50:21.691760 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:21.691741 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26365148514fa10173edd0155a08a3fb.slice/crio-06b00dc559a7eee1e69bbdab35f332ad8b2de3c526382d525daecb966e280295 WatchSource:0}: Error finding container 06b00dc559a7eee1e69bbdab35f332ad8b2de3c526382d525daecb966e280295: Status 404 returned error can't find the container with id 06b00dc559a7eee1e69bbdab35f332ad8b2de3c526382d525daecb966e280295
Apr 21 01:50:21.697269 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.697256 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 01:50:21.738178 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:21.738154 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-35.ec2.internal\" not found"
Apr 21 01:50:21.775420 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.775375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal" event={"ID":"26365148514fa10173edd0155a08a3fb","Type":"ContainerStarted","Data":"06b00dc559a7eee1e69bbdab35f332ad8b2de3c526382d525daecb966e280295"}
Apr 21 01:50:21.776210 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.776193 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal" event={"ID":"7d7b413176f7ee78f446a0aea3fee038","Type":"ContainerStarted","Data":"9e4b21b850420c385b73e27d243de99ec3e2d913885654e6952202a9fafcf86c"}
Apr 21 01:50:21.825824 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.825800 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:21.845572 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.845549 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.855087 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.855068 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 01:50:21.857417 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.857404 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal"
Apr 21 01:50:21.864638 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.864624 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 01:50:21.926554 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:21.926529 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:22.392848 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.392818 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:22.489479 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.489452 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:22.623775 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.623745 2568 apiserver.go:52] "Watching apiserver"
Apr 21 01:50:22.630881 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.630859 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 01:50:22.633018 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.632989 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-qwhhg","openshift-dns/node-resolver-shp9g","openshift-image-registry/node-ca-bv5lq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal","openshift-multus/multus-rmwq4","openshift-multus/multus-additional-cni-plugins-wzkbb","openshift-multus/network-metrics-daemon-pqvmq","openshift-network-diagnostics/network-check-target-z86fj","openshift-network-operator/iptables-alerter-kcf2g","openshift-ovn-kubernetes/ovnkube-node-bvzx2","kube-system/konnectivity-agent-rtnjq","kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"]
Apr 21 01:50:22.634992 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.634967 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.636101 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.636081 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-shp9g" Apr 21 01:50:22.637158 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.637135 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 01:50:22.637247 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.637141 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 01:50:22.637324 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.637254 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bv5lq" Apr 21 01:50:22.637558 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.637481 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 01:50:22.637558 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.637505 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 01:50:22.637708 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.637616 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 01:50:22.638053 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.638032 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 01:50:22.638053 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.638045 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nc6t6\"" Apr 21 01:50:22.638228 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.638090 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 
01:50:22.638228 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.638110 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 01:50:22.638493 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.638476 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4gxfw\"" Apr 21 01:50:22.638551 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.638508 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.639363 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.639341 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 01:50:22.639465 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.639389 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gnmgg\"" Apr 21 01:50:22.639667 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.639641 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 01:50:22.639979 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.639946 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 01:50:22.640252 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.640234 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.640694 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.640641 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 01:50:22.640694 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.640671 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 01:50:22.641012 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.640991 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 01:50:22.641580 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.640641 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 01:50:22.641580 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.641300 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-f7p8j\"" Apr 21 01:50:22.644638 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.644239 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:22.644638 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:22.644347 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a" Apr 21 01:50:22.644638 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.644442 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 01:50:22.644638 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.644485 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zbrlm\"" Apr 21 01:50:22.644867 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.644698 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 01:50:22.645773 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.645748 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:22.645864 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:22.645817 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886" Apr 21 01:50:22.647024 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.646986 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kcf2g" Apr 21 01:50:22.648272 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.648251 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.648820 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.648792 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 01:50:22.649018 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.649000 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 01:50:22.649078 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.649032 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 01:50:22.649135 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.649119 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nhnc9\"" Apr 21 01:50:22.649859 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.649561 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-rtnjq" Apr 21 01:50:22.650190 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.650153 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 01:50:22.650582 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.650561 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 01:50:22.650713 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.650636 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fqxf6\"" Apr 21 01:50:22.651859 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.651701 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 01:50:22.651859 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.651807 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 01:50:22.651859 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.651844 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rqt2m\"" Apr 21 01:50:22.652064 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.651939 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" Apr 21 01:50:22.653902 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.653882 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h2t9f\"" Apr 21 01:50:22.653995 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.653949 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 01:50:22.654097 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.654079 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 01:50:22.654215 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.654200 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 01:50:22.656740 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.656721 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-ovnkube-config\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.656833 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.656758 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcr54\" (UniqueName: \"kubernetes.io/projected/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-kube-api-access-mcr54\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.656833 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.656783 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-system-cni-dir\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.656833 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.656805 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-var-lib-cni-bin\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.656991 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.656842 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-cnibin\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.656991 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.656862 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94b3f448-6380-4226-b329-a7e8b2cad657-host\") pod \"node-ca-bv5lq\" (UID: \"94b3f448-6380-4226-b329-a7e8b2cad657\") " pod="openshift-image-registry/node-ca-bv5lq" Apr 21 01:50:22.656991 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.656897 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-os-release\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.656991 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.656922 
2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-run-systemd\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.656991 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.656950 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8-hosts-file\") pod \"node-resolver-shp9g\" (UID: \"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8\") " pod="openshift-dns/node-resolver-shp9g" Apr 21 01:50:22.656991 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.656973 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-etc-kubernetes\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.657271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657008 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4kkv\" (UniqueName: \"kubernetes.io/projected/333616b1-f960-4eb6-b4fd-448534b9cd3a-kube-api-access-h4kkv\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:22.657271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657043 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-system-cni-dir\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " 
pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.657271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657061 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/446458a6-4d58-4666-88b6-92203ea344ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.657271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657118 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-daemon-config\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.657271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657157 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-run-ovn\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.657271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657182 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-node-log\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.657271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657207 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-cni-netd\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.657271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657230 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-env-overrides\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.657271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-ovn-node-metrics-cert\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657281 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj4sr\" (UniqueName: \"kubernetes.io/projected/94b3f448-6380-4226-b329-a7e8b2cad657-kube-api-access-dj4sr\") pod \"node-ca-bv5lq\" (UID: \"94b3f448-6380-4226-b329-a7e8b2cad657\") " pod="openshift-image-registry/node-ca-bv5lq" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657321 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-cnibin\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657346 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-cni-binary-copy\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657370 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-var-lib-cni-multus\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657395 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-cni-dir\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657417 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-var-lib-kubelet\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657440 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:22.657718 ip-10-0-141-35 
kubenswrapper[2568]: I0421 01:50:22.657462 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-run-openvswitch\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-hostroot\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657519 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-conf-dir\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657547 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-systemd-units\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657575 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-slash\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657615 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-ovnkube-script-lib\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657637 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtqfv\" (UniqueName: \"kubernetes.io/projected/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-kube-api-access-jtqfv\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657661 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/446458a6-4d58-4666-88b6-92203ea344ee-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.657718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657703 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-etc-openvswitch\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657731 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cj6pt\" (UniqueName: \"kubernetes.io/projected/d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8-kube-api-access-cj6pt\") pod \"node-resolver-shp9g\" (UID: \"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8\") " pod="openshift-dns/node-resolver-shp9g" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657749 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/446458a6-4d58-4666-88b6-92203ea344ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657781 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7bst\" (UniqueName: \"kubernetes.io/projected/446458a6-4d58-4666-88b6-92203ea344ee-kube-api-access-p7bst\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657810 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-cni-bin\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657836 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8-tmp-dir\") pod \"node-resolver-shp9g\" (UID: \"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8\") " pod="openshift-dns/node-resolver-shp9g" Apr 21 
01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657861 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94b3f448-6380-4226-b329-a7e8b2cad657-serviceca\") pod \"node-ca-bv5lq\" (UID: \"94b3f448-6380-4226-b329-a7e8b2cad657\") " pod="openshift-image-registry/node-ca-bv5lq" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657903 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-run-k8s-cni-cncf-io\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657941 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-run-multus-certs\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657960 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.657981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-run-netns\") pod \"ovnkube-node-bvzx2\" 
(UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658006 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-var-lib-openvswitch\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658031 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/87063399-3100-43c9-a654-5ca49d106c2c-iptables-alerter-script\") pod \"iptables-alerter-kcf2g\" (UID: \"87063399-3100-43c9-a654-5ca49d106c2c\") " pod="openshift-network-operator/iptables-alerter-kcf2g" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658051 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-run-netns\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658074 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddxsw\" (UniqueName: \"kubernetes.io/projected/87063399-3100-43c9-a654-5ca49d106c2c-kube-api-access-ddxsw\") pod \"iptables-alerter-kcf2g\" (UID: \"87063399-3100-43c9-a654-5ca49d106c2c\") " pod="openshift-network-operator/iptables-alerter-kcf2g" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658091 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-f22tv\" (UniqueName: \"kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv\") pod \"network-check-target-z86fj\" (UID: \"ee736496-b4a2-4832-ab28-516d69f51886\") " pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:22.658454 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-kubelet\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.659174 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658121 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-log-socket\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.659174 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658141 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.659174 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658183 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-socket-dir-parent\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " 
pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.659174 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658203 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87063399-3100-43c9-a654-5ca49d106c2c-host-slash\") pod \"iptables-alerter-kcf2g\" (UID: \"87063399-3100-43c9-a654-5ca49d106c2c\") " pod="openshift-network-operator/iptables-alerter-kcf2g" Apr 21 01:50:22.659174 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658225 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-os-release\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.659174 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.658261 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-run-ovn-kubernetes\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.686270 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.686242 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 01:45:21 +0000 UTC" deadline="2027-12-29 13:34:50.057626192 +0000 UTC" Apr 21 01:50:22.686270 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.686269 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14819h44m27.371360139s" Apr 21 01:50:22.747115 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.747080 2568 desired_state_of_world_populator.go:158] "Finished populating 
initial desired state of world" Apr 21 01:50:22.758439 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-run-openvswitch\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.758585 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758454 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-tuned\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.758585 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758473 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-etc-selinux\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" Apr 21 01:50:22.758585 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-hostroot\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.758585 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758549 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-conf-dir\") pod \"multus-rmwq4\" (UID: 
\"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.758585 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-run-openvswitch\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.758585 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758571 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-systemd-units\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758593 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-slash\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758598 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-hostroot\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758612 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-conf-dir\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " 
pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758616 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-ovnkube-script-lib\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758648 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-slash\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758647 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-systemd-units\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758657 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm6tk\" (UniqueName: \"kubernetes.io/projected/9414d859-6831-469a-aaca-dd269e3b122c-kube-api-access-lm6tk\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758687 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtqfv\" (UniqueName: \"kubernetes.io/projected/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-kube-api-access-jtqfv\") pod \"multus-rmwq4\" 
(UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758705 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/446458a6-4d58-4666-88b6-92203ea344ee-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758727 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-etc-openvswitch\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758751 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-sysctl-d\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/899e33b4-0828-4733-86e4-56750cb8ec32-agent-certs\") pod \"konnectivity-agent-rtnjq\" (UID: \"899e33b4-0828-4733-86e4-56750cb8ec32\") " pod="kube-system/konnectivity-agent-rtnjq" Apr 21 01:50:22.758858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758855 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758884 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cj6pt\" (UniqueName: \"kubernetes.io/projected/d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8-kube-api-access-cj6pt\") pod \"node-resolver-shp9g\" (UID: \"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8\") " pod="openshift-dns/node-resolver-shp9g" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758911 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/446458a6-4d58-4666-88b6-92203ea344ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758941 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7bst\" (UniqueName: \"kubernetes.io/projected/446458a6-4d58-4666-88b6-92203ea344ee-kube-api-access-p7bst\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758966 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-cni-bin\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.758995 
2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-sysconfig\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8-tmp-dir\") pod \"node-resolver-shp9g\" (UID: \"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8\") " pod="openshift-dns/node-resolver-shp9g" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759037 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-etc-openvswitch\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94b3f448-6380-4226-b329-a7e8b2cad657-serviceca\") pod \"node-ca-bv5lq\" (UID: \"94b3f448-6380-4226-b329-a7e8b2cad657\") " pod="openshift-image-registry/node-ca-bv5lq" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759091 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-run-k8s-cni-cncf-io\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759119 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-run-multus-certs\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759169 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-run-netns\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759194 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-var-lib-openvswitch\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759224 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-run\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.759488 ip-10-0-141-35 
kubenswrapper[2568]: I0421 01:50:22.759250 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/87063399-3100-43c9-a654-5ca49d106c2c-iptables-alerter-script\") pod \"iptables-alerter-kcf2g\" (UID: \"87063399-3100-43c9-a654-5ca49d106c2c\") " pod="openshift-network-operator/iptables-alerter-kcf2g" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759282 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-run-netns\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.759488 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759289 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/446458a6-4d58-4666-88b6-92203ea344ee-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759323 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddxsw\" (UniqueName: \"kubernetes.io/projected/87063399-3100-43c9-a654-5ca49d106c2c-kube-api-access-ddxsw\") pod \"iptables-alerter-kcf2g\" (UID: \"87063399-3100-43c9-a654-5ca49d106c2c\") " pod="openshift-network-operator/iptables-alerter-kcf2g" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759352 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f22tv\" (UniqueName: \"kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv\") pod \"network-check-target-z86fj\" (UID: 
\"ee736496-b4a2-4832-ab28-516d69f51886\") " pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759375 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-kubelet\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759379 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-ovnkube-script-lib\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759384 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-run-netns\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759399 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-log-socket\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759443 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-run-multus-certs\") pod \"multus-rmwq4\" 
(UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759459 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-run-k8s-cni-cncf-io\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759494 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjxwj\" (UniqueName: \"kubernetes.io/projected/0e6e0ca7-995a-4155-94d3-02572b72d57c-kube-api-access-tjxwj\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759513 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-var-lib-openvswitch\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759523 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/899e33b4-0828-4733-86e4-56750cb8ec32-konnectivity-ca\") pod \"konnectivity-agent-rtnjq\" (UID: \"899e33b4-0828-4733-86e4-56750cb8ec32\") " pod="kube-system/konnectivity-agent-rtnjq" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759523 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759555 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-socket-dir-parent\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759566 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/94b3f448-6380-4226-b329-a7e8b2cad657-serviceca\") pod \"node-ca-bv5lq\" (UID: \"94b3f448-6380-4226-b329-a7e8b2cad657\") " pod="openshift-image-registry/node-ca-bv5lq" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/446458a6-4d58-4666-88b6-92203ea344ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.760279 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759606 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-socket-dir-parent\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759496 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-cni-bin\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb" Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759698 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-kubelet\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759792 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-run-netns\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4" Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-log-socket\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759851 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8-tmp-dir\") pod \"node-resolver-shp9g\" (UID: \"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8\") " pod="openshift-dns/node-resolver-shp9g"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87063399-3100-43c9-a654-5ca49d106c2c-host-slash\") pod \"iptables-alerter-kcf2g\" (UID: \"87063399-3100-43c9-a654-5ca49d106c2c\") " pod="openshift-network-operator/iptables-alerter-kcf2g"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759890 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-os-release\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759910 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87063399-3100-43c9-a654-5ca49d106c2c-host-slash\") pod \"iptables-alerter-kcf2g\" (UID: \"87063399-3100-43c9-a654-5ca49d106c2c\") " pod="openshift-network-operator/iptables-alerter-kcf2g"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759928 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-run-ovn-kubernetes\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759934 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-os-release\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759954 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-ovnkube-config\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759977 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcr54\" (UniqueName: \"kubernetes.io/projected/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-kube-api-access-mcr54\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.759986 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-run-ovn-kubernetes\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760005 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-kubernetes\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760027 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-system-cni-dir\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.760933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760042 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-var-lib-cni-bin\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760042 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/87063399-3100-43c9-a654-5ca49d106c2c-iptables-alerter-script\") pod \"iptables-alerter-kcf2g\" (UID: \"87063399-3100-43c9-a654-5ca49d106c2c\") " pod="openshift-network-operator/iptables-alerter-kcf2g"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760060 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-cnibin\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760102 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-system-cni-dir\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760116 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-var-lib-kubelet\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760116 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-cnibin\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760142 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94b3f448-6380-4226-b329-a7e8b2cad657-host\") pod \"node-ca-bv5lq\" (UID: \"94b3f448-6380-4226-b329-a7e8b2cad657\") " pod="openshift-image-registry/node-ca-bv5lq"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760144 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-var-lib-cni-bin\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760160 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-os-release\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760166 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94b3f448-6380-4226-b329-a7e8b2cad657-host\") pod \"node-ca-bv5lq\" (UID: \"94b3f448-6380-4226-b329-a7e8b2cad657\") " pod="openshift-image-registry/node-ca-bv5lq"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-run-systemd\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-lib-modules\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-run-systemd\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760210 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-os-release\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760221 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-registration-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760257 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8-hosts-file\") pod \"node-resolver-shp9g\" (UID: \"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8\") " pod="openshift-dns/node-resolver-shp9g"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760281 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-etc-kubernetes\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760324 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4kkv\" (UniqueName: \"kubernetes.io/projected/333616b1-f960-4eb6-b4fd-448534b9cd3a-kube-api-access-h4kkv\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:22.761624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760350 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-system-cni-dir\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760373 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/446458a6-4d58-4666-88b6-92203ea344ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-etc-kubernetes\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760406 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8-hosts-file\") pod \"node-resolver-shp9g\" (UID: \"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8\") " pod="openshift-dns/node-resolver-shp9g"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760392 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-systemd\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-sys\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760454 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/446458a6-4d58-4666-88b6-92203ea344ee-system-cni-dir\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760472 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-daemon-config\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760496 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-run-ovn\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760520 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-node-log\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-node-log\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760616 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-run-ovn\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760641 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-cni-netd\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760672 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-host-cni-netd\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760685 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-env-overrides\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760712 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-ovn-node-metrics-cert\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-sysctl-conf\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.762189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760775 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-socket-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760803 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj4sr\" (UniqueName: \"kubernetes.io/projected/94b3f448-6380-4226-b329-a7e8b2cad657-kube-api-access-dj4sr\") pod \"node-ca-bv5lq\" (UID: \"94b3f448-6380-4226-b329-a7e8b2cad657\") " pod="openshift-image-registry/node-ca-bv5lq"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760832 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-cnibin\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760863 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-modprobe-d\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760886 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-host\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760931 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-device-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760951 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/446458a6-4d58-4666-88b6-92203ea344ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760958 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-cni-binary-copy\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.760983 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-var-lib-cni-multus\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761007 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-cni-dir\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761030 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e6e0ca7-995a-4155-94d3-02572b72d57c-tmp\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761053 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-sys-fs\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761084 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-var-lib-kubelet\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761107 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-daemon-config\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761109 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:22.761205 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761293 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-cnibin\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.762738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761298 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-var-lib-cni-multus\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.763187 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:22.761335 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs podName:333616b1-f960-4eb6-b4fd-448534b9cd3a nodeName:}" failed. No retries permitted until 2026-04-21 01:50:23.261300418 +0000 UTC m=+3.031262597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs") pod "network-metrics-daemon-pqvmq" (UID: "333616b1-f960-4eb6-b4fd-448534b9cd3a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:22.763187 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761356 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-multus-cni-dir\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.763187 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761388 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-host-var-lib-kubelet\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.763187 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761642 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-cni-binary-copy\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.763187 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761713 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 01:50:22.763187 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761729 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-ovnkube-config\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.763187 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.761843 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-env-overrides\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.764327 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.764295 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-ovn-node-metrics-cert\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.772731 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:22.772662 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 01:50:22.772731 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:22.772704 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 01:50:22.772731 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:22.772718 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f22tv for pod openshift-network-diagnostics/network-check-target-z86fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:22.773024 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:22.772804 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv podName:ee736496-b4a2-4832-ab28-516d69f51886 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:23.272775282 +0000 UTC m=+3.042737449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-f22tv" (UniqueName: "kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv") pod "network-check-target-z86fj" (UID: "ee736496-b4a2-4832-ab28-516d69f51886") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:22.775275 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.775183 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtqfv\" (UniqueName: \"kubernetes.io/projected/4d5eb879-dd29-4c7f-8643-6f9a6b561eda-kube-api-access-jtqfv\") pod \"multus-rmwq4\" (UID: \"4d5eb879-dd29-4c7f-8643-6f9a6b561eda\") " pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.775857 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.775834 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddxsw\" (UniqueName: \"kubernetes.io/projected/87063399-3100-43c9-a654-5ca49d106c2c-kube-api-access-ddxsw\") pod \"iptables-alerter-kcf2g\" (UID: \"87063399-3100-43c9-a654-5ca49d106c2c\") " pod="openshift-network-operator/iptables-alerter-kcf2g"
Apr 21 01:50:22.775955 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.775860 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcr54\" (UniqueName: \"kubernetes.io/projected/8889bb55-ecc3-4f0f-b6a3-5c5f2e739440-kube-api-access-mcr54\") pod \"ovnkube-node-bvzx2\" (UID: \"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.776015 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.775966 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj4sr\" (UniqueName: \"kubernetes.io/projected/94b3f448-6380-4226-b329-a7e8b2cad657-kube-api-access-dj4sr\") pod \"node-ca-bv5lq\" (UID: \"94b3f448-6380-4226-b329-a7e8b2cad657\") " pod="openshift-image-registry/node-ca-bv5lq"
Apr 21 01:50:22.776668 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.776337 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj6pt\" (UniqueName: \"kubernetes.io/projected/d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8-kube-api-access-cj6pt\") pod \"node-resolver-shp9g\" (UID: \"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8\") " pod="openshift-dns/node-resolver-shp9g"
Apr 21 01:50:22.776668 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.776442 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4kkv\" (UniqueName: \"kubernetes.io/projected/333616b1-f960-4eb6-b4fd-448534b9cd3a-kube-api-access-h4kkv\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:22.777205 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.777184 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7bst\" (UniqueName: \"kubernetes.io/projected/446458a6-4d58-4666-88b6-92203ea344ee-kube-api-access-p7bst\") pod \"multus-additional-cni-plugins-wzkbb\" (UID: \"446458a6-4d58-4666-88b6-92203ea344ee\") " pod="openshift-multus/multus-additional-cni-plugins-wzkbb"
Apr 21 01:50:22.862397 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862364 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-registration-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.862579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-systemd\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862432 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-sys\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862458 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-sysctl-conf\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862497 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-socket-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.862579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862508 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-registration-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.862579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862516 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-systemd\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862535 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-modprobe-d\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862551 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-host\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862561 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-sys\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862567 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-device-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862623 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e6e0ca7-995a-4155-94d3-02572b72d57c-tmp\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862659 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-socket-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-sys-fs\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862687 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-sysctl-conf\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862705 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-device-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862715 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-modprobe-d\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-tuned\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862756 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-sys-fs\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862771 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-etc-selinux\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 
01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862779 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-host\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862807 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm6tk\" (UniqueName: \"kubernetes.io/projected/9414d859-6831-469a-aaca-dd269e3b122c-kube-api-access-lm6tk\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862836 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-sysctl-d\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862859 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-etc-selinux\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862892 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/899e33b4-0828-4733-86e4-56750cb8ec32-agent-certs\") pod \"konnectivity-agent-rtnjq\" (UID: \"899e33b4-0828-4733-86e4-56750cb8ec32\") " 
pod="kube-system/konnectivity-agent-rtnjq" Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862916 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" Apr 21 01:50:22.862973 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862974 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-sysconfig\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863014 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-run\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863029 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9414d859-6831-469a-aaca-dd269e3b122c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.862978 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-sysctl-d\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863081 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjxwj\" (UniqueName: \"kubernetes.io/projected/0e6e0ca7-995a-4155-94d3-02572b72d57c-kube-api-access-tjxwj\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863101 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-sysconfig\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863109 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/899e33b4-0828-4733-86e4-56750cb8ec32-konnectivity-ca\") pod \"konnectivity-agent-rtnjq\" (UID: \"899e33b4-0828-4733-86e4-56750cb8ec32\") " pod="kube-system/konnectivity-agent-rtnjq" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863122 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-run\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-kubernetes\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863211 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-var-lib-kubelet\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863241 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-lib-modules\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863358 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-kubernetes\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863484 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-var-lib-kubelet\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.863727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e6e0ca7-995a-4155-94d3-02572b72d57c-lib-modules\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.864368 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.863791 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/899e33b4-0828-4733-86e4-56750cb8ec32-konnectivity-ca\") pod \"konnectivity-agent-rtnjq\" (UID: \"899e33b4-0828-4733-86e4-56750cb8ec32\") " pod="kube-system/konnectivity-agent-rtnjq" Apr 21 01:50:22.865511 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.865488 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e6e0ca7-995a-4155-94d3-02572b72d57c-etc-tuned\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.865511 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.865502 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e6e0ca7-995a-4155-94d3-02572b72d57c-tmp\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" Apr 21 01:50:22.866155 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.866130 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/899e33b4-0828-4733-86e4-56750cb8ec32-agent-certs\") pod \"konnectivity-agent-rtnjq\" (UID: \"899e33b4-0828-4733-86e4-56750cb8ec32\") " pod="kube-system/konnectivity-agent-rtnjq" Apr 21 01:50:22.870215 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.870193 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjxwj\" (UniqueName: \"kubernetes.io/projected/0e6e0ca7-995a-4155-94d3-02572b72d57c-kube-api-access-tjxwj\") pod \"tuned-qwhhg\" (UID: \"0e6e0ca7-995a-4155-94d3-02572b72d57c\") " pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" 
Apr 21 01:50:22.870794 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.870769 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm6tk\" (UniqueName: \"kubernetes.io/projected/9414d859-6831-469a-aaca-dd269e3b122c-kube-api-access-lm6tk\") pod \"aws-ebs-csi-driver-node-bnf9w\" (UID: \"9414d859-6831-469a-aaca-dd269e3b122c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:22.947859 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.947785 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:22.956951 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.956922 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-shp9g"
Apr 21 01:50:22.963974 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.963954 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bv5lq"
Apr 21 01:50:22.968490 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.968470 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rmwq4"
Apr 21 01:50:22.975999 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.975983 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wzkbb"
Apr 21 01:50:22.983515 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.983499 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kcf2g"
Apr 21 01:50:22.990052 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.990034 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qwhhg"
Apr 21 01:50:22.995569 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:22.995551 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rtnjq"
Apr 21 01:50:23.000660 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.000642 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w"
Apr 21 01:50:23.265852 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.265780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:23.265982 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:23.265918 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:23.265982 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:23.265972 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs podName:333616b1-f960-4eb6-b4fd-448534b9cd3a nodeName:}" failed. No retries permitted until 2026-04-21 01:50:24.265957576 +0000 UTC m=+4.035919733 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs") pod "network-metrics-daemon-pqvmq" (UID: "333616b1-f960-4eb6-b4fd-448534b9cd3a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:23.302195 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:23.302165 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod446458a6_4d58_4666_88b6_92203ea344ee.slice/crio-6ef14ad1b865b2ae31040048bb05f5c6e54e1a69f3c7bdce71d4b154df6f3d30 WatchSource:0}: Error finding container 6ef14ad1b865b2ae31040048bb05f5c6e54e1a69f3c7bdce71d4b154df6f3d30: Status 404 returned error can't find the container with id 6ef14ad1b865b2ae31040048bb05f5c6e54e1a69f3c7bdce71d4b154df6f3d30
Apr 21 01:50:23.303912 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:23.303887 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e6e0ca7_995a_4155_94d3_02572b72d57c.slice/crio-115b22ec13e4d2c530e0ba3e6f6765e1b530389e86eb9239ba8f7d2bd070f9a8 WatchSource:0}: Error finding container 115b22ec13e4d2c530e0ba3e6f6765e1b530389e86eb9239ba8f7d2bd070f9a8: Status 404 returned error can't find the container with id 115b22ec13e4d2c530e0ba3e6f6765e1b530389e86eb9239ba8f7d2bd070f9a8
Apr 21 01:50:23.307376 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:23.307356 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8889bb55_ecc3_4f0f_b6a3_5c5f2e739440.slice/crio-ac1cd954f2376467a9bc329f448caf5f3b361e743732bbba3d8f69a5e8d35a10 WatchSource:0}: Error finding container ac1cd954f2376467a9bc329f448caf5f3b361e743732bbba3d8f69a5e8d35a10: Status 404 returned error can't find the container with id ac1cd954f2376467a9bc329f448caf5f3b361e743732bbba3d8f69a5e8d35a10
Apr 21 01:50:23.308338 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:23.308299 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b3f448_6380_4226_b329_a7e8b2cad657.slice/crio-0bf1195574426e89aa9f185c0457a7a2f122f39a434ca5e5ef514bcf7b6e0647 WatchSource:0}: Error finding container 0bf1195574426e89aa9f185c0457a7a2f122f39a434ca5e5ef514bcf7b6e0647: Status 404 returned error can't find the container with id 0bf1195574426e89aa9f185c0457a7a2f122f39a434ca5e5ef514bcf7b6e0647
Apr 21 01:50:23.309024 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:23.308995 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd406d2d2_81f8_42f9_bc0e_baf9d5cdccc8.slice/crio-95dde6225030f7b5e07b7823a45d5741cb4b7eea756f84ba4255ddddd125557e WatchSource:0}: Error finding container 95dde6225030f7b5e07b7823a45d5741cb4b7eea756f84ba4255ddddd125557e: Status 404 returned error can't find the container with id 95dde6225030f7b5e07b7823a45d5741cb4b7eea756f84ba4255ddddd125557e
Apr 21 01:50:23.309988 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:23.309963 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87063399_3100_43c9_a654_5ca49d106c2c.slice/crio-e0aaf7db67ad09c2308f88c32ae880535707e4eb79845110f0216cf271c96760 WatchSource:0}: Error finding container e0aaf7db67ad09c2308f88c32ae880535707e4eb79845110f0216cf271c96760: Status 404 returned error can't find the container with id e0aaf7db67ad09c2308f88c32ae880535707e4eb79845110f0216cf271c96760
Apr 21 01:50:23.310820 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:23.310777 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9414d859_6831_469a_aaca_dd269e3b122c.slice/crio-ca6a4c7313e01e925b7e19c221ed0329c62fbb9c068d503a1dc84907c27f337b WatchSource:0}: Error finding container ca6a4c7313e01e925b7e19c221ed0329c62fbb9c068d503a1dc84907c27f337b: Status 404 returned error can't find the container with id ca6a4c7313e01e925b7e19c221ed0329c62fbb9c068d503a1dc84907c27f337b
Apr 21 01:50:23.314273 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:23.314197 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d5eb879_dd29_4c7f_8643_6f9a6b561eda.slice/crio-d67b1d0a3367a378078fb26e3fd2e3448962f28b3adfefc03aed740f62d9340b WatchSource:0}: Error finding container d67b1d0a3367a378078fb26e3fd2e3448962f28b3adfefc03aed740f62d9340b: Status 404 returned error can't find the container with id d67b1d0a3367a378078fb26e3fd2e3448962f28b3adfefc03aed740f62d9340b
Apr 21 01:50:23.314855 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:23.314493 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod899e33b4_0828_4733_86e4_56750cb8ec32.slice/crio-62e5c471f8d12e4db9907f2f179607deb0cb658996afa2298910c234a64ff58f WatchSource:0}: Error finding container 62e5c471f8d12e4db9907f2f179607deb0cb658996afa2298910c234a64ff58f: Status 404 returned error can't find the container with id 62e5c471f8d12e4db9907f2f179607deb0cb658996afa2298910c234a64ff58f
Apr 21 01:50:23.367210 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.367028 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f22tv\" (UniqueName: \"kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv\") pod \"network-check-target-z86fj\" (UID: \"ee736496-b4a2-4832-ab28-516d69f51886\") " pod="openshift-network-diagnostics/network-check-target-z86fj"
Apr 21 01:50:23.367363 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:23.367178 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 01:50:23.367363 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:23.367243 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 01:50:23.367363 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:23.367254 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f22tv for pod openshift-network-diagnostics/network-check-target-z86fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:23.367363 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:23.367317 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv podName:ee736496-b4a2-4832-ab28-516d69f51886 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:24.367289152 +0000 UTC m=+4.137251308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-f22tv" (UniqueName: "kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv") pod "network-check-target-z86fj" (UID: "ee736496-b4a2-4832-ab28-516d69f51886") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:23.686680 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.686568 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 01:45:21 +0000 UTC" deadline="2027-12-25 13:15:11.870997115 +0000 UTC"
Apr 21 01:50:23.686680 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.686609 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14723h24m48.184392262s"
Apr 21 01:50:23.785609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.785551 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" event={"ID":"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440","Type":"ContainerStarted","Data":"ac1cd954f2376467a9bc329f448caf5f3b361e743732bbba3d8f69a5e8d35a10"}
Apr 21 01:50:23.790387 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.790334 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" event={"ID":"0e6e0ca7-995a-4155-94d3-02572b72d57c","Type":"ContainerStarted","Data":"115b22ec13e4d2c530e0ba3e6f6765e1b530389e86eb9239ba8f7d2bd070f9a8"}
Apr 21 01:50:23.808245 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.808137 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" event={"ID":"446458a6-4d58-4666-88b6-92203ea344ee","Type":"ContainerStarted","Data":"6ef14ad1b865b2ae31040048bb05f5c6e54e1a69f3c7bdce71d4b154df6f3d30"}
Apr 21 01:50:23.814700 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.814668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bv5lq" event={"ID":"94b3f448-6380-4226-b329-a7e8b2cad657","Type":"ContainerStarted","Data":"0bf1195574426e89aa9f185c0457a7a2f122f39a434ca5e5ef514bcf7b6e0647"}
Apr 21 01:50:23.818402 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.818345 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rtnjq" event={"ID":"899e33b4-0828-4733-86e4-56750cb8ec32","Type":"ContainerStarted","Data":"62e5c471f8d12e4db9907f2f179607deb0cb658996afa2298910c234a64ff58f"}
Apr 21 01:50:23.824380 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.824327 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rmwq4" event={"ID":"4d5eb879-dd29-4c7f-8643-6f9a6b561eda","Type":"ContainerStarted","Data":"d67b1d0a3367a378078fb26e3fd2e3448962f28b3adfefc03aed740f62d9340b"}
Apr 21 01:50:23.833739 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.833691 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" event={"ID":"9414d859-6831-469a-aaca-dd269e3b122c","Type":"ContainerStarted","Data":"ca6a4c7313e01e925b7e19c221ed0329c62fbb9c068d503a1dc84907c27f337b"}
Apr 21 01:50:23.837037 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.836993 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kcf2g" event={"ID":"87063399-3100-43c9-a654-5ca49d106c2c","Type":"ContainerStarted","Data":"e0aaf7db67ad09c2308f88c32ae880535707e4eb79845110f0216cf271c96760"}
Apr 21 01:50:23.845484 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.845419 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-shp9g" event={"ID":"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8","Type":"ContainerStarted","Data":"95dde6225030f7b5e07b7823a45d5741cb4b7eea756f84ba4255ddddd125557e"}
Apr 21 01:50:23.856381 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.855613 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal" event={"ID":"26365148514fa10173edd0155a08a3fb","Type":"ContainerStarted","Data":"2412f9fdf724814d95e9511c1dece271b4fef700cd4d3cae4f4edc28fe731876"}
Apr 21 01:50:23.869880 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:23.869820 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-35.ec2.internal" podStartSLOduration=2.869805763 podStartE2EDuration="2.869805763s" podCreationTimestamp="2026-04-21 01:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:50:23.869577922 +0000 UTC m=+3.639540098" watchObservedRunningTime="2026-04-21 01:50:23.869805763 +0000 UTC m=+3.639767942"
Apr 21 01:50:24.275100 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:24.275062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:24.275272 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:24.275251 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:24.275353 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:24.275328 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs podName:333616b1-f960-4eb6-b4fd-448534b9cd3a nodeName:}" failed. No retries permitted until 2026-04-21 01:50:26.275294208 +0000 UTC m=+6.045256364 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs") pod "network-metrics-daemon-pqvmq" (UID: "333616b1-f960-4eb6-b4fd-448534b9cd3a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:24.376200 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:24.376111 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f22tv\" (UniqueName: \"kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv\") pod \"network-check-target-z86fj\" (UID: \"ee736496-b4a2-4832-ab28-516d69f51886\") " pod="openshift-network-diagnostics/network-check-target-z86fj"
Apr 21 01:50:24.376398 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:24.376281 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 01:50:24.376398 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:24.376302 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 01:50:24.376398 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:24.376330 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f22tv for pod openshift-network-diagnostics/network-check-target-z86fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:24.376398 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:24.376392 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv podName:ee736496-b4a2-4832-ab28-516d69f51886 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:26.3763722 +0000 UTC m=+6.146334355 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-f22tv" (UniqueName: "kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv") pod "network-check-target-z86fj" (UID: "ee736496-b4a2-4832-ab28-516d69f51886") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:24.775169 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:24.775092 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj"
Apr 21 01:50:24.775631 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:24.775216 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886"
Apr 21 01:50:24.775702 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:24.775650 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:24.775785 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:24.775762 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a"
Apr 21 01:50:24.882006 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:24.880749 2568 generic.go:358] "Generic (PLEG): container finished" podID="7d7b413176f7ee78f446a0aea3fee038" containerID="909275c303ec60e2b7e70650b05db89cbe95f587d62859ddcf4ec6427d2660a8" exitCode=0
Apr 21 01:50:24.882006 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:24.880950 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal" event={"ID":"7d7b413176f7ee78f446a0aea3fee038","Type":"ContainerDied","Data":"909275c303ec60e2b7e70650b05db89cbe95f587d62859ddcf4ec6427d2660a8"}
Apr 21 01:50:25.885776 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:25.885737 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal" event={"ID":"7d7b413176f7ee78f446a0aea3fee038","Type":"ContainerStarted","Data":"269e684f59b52ba24455683b69555807dfee736e856a0cbf4a5772c39e8359f9"}
Apr 21 01:50:25.899524 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:25.899474 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-35.ec2.internal" podStartSLOduration=4.899445496 podStartE2EDuration="4.899445496s" podCreationTimestamp="2026-04-21 01:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:50:25.89942334 +0000 UTC m=+5.669385516" watchObservedRunningTime="2026-04-21 01:50:25.899445496 +0000 UTC m=+5.669407673"
Apr 21 01:50:26.292179 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:26.292052 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName:
\"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:26.292348 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:26.292268 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:26.292428 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:26.292361 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs podName:333616b1-f960-4eb6-b4fd-448534b9cd3a nodeName:}" failed. No retries permitted until 2026-04-21 01:50:30.292340901 +0000 UTC m=+10.062303069 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs") pod "network-metrics-daemon-pqvmq" (UID: "333616b1-f960-4eb6-b4fd-448534b9cd3a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:26.393876 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:26.393260 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f22tv\" (UniqueName: \"kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv\") pod \"network-check-target-z86fj\" (UID: \"ee736496-b4a2-4832-ab28-516d69f51886\") " pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:26.393876 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:26.393448 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:50:26.393876 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:26.393468 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:50:26.393876 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:26.393480 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f22tv for pod openshift-network-diagnostics/network-check-target-z86fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:26.393876 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:26.393545 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv podName:ee736496-b4a2-4832-ab28-516d69f51886 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:30.393527105 +0000 UTC m=+10.163489262 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-f22tv" (UniqueName: "kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv") pod "network-check-target-z86fj" (UID: "ee736496-b4a2-4832-ab28-516d69f51886") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:26.773291 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:26.772758 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:26.773291 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:26.772931 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a" Apr 21 01:50:26.773291 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:26.772758 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:26.773291 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:26.773066 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886" Apr 21 01:50:28.772568 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:28.772515 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:28.773012 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:28.772651 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a" Apr 21 01:50:28.773086 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:28.772515 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:28.773137 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:28.773110 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886" Apr 21 01:50:30.326520 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:30.326364 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:30.326992 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:30.326556 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:30.326992 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:30.326634 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs podName:333616b1-f960-4eb6-b4fd-448534b9cd3a nodeName:}" failed. No retries permitted until 2026-04-21 01:50:38.326605378 +0000 UTC m=+18.096567542 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs") pod "network-metrics-daemon-pqvmq" (UID: "333616b1-f960-4eb6-b4fd-448534b9cd3a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:30.427367 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:30.427329 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f22tv\" (UniqueName: \"kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv\") pod \"network-check-target-z86fj\" (UID: \"ee736496-b4a2-4832-ab28-516d69f51886\") " pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:30.427540 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:30.427520 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:50:30.427602 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:30.427542 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:50:30.427602 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:30.427555 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f22tv for pod openshift-network-diagnostics/network-check-target-z86fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:30.427716 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:30.427616 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv podName:ee736496-b4a2-4832-ab28-516d69f51886 nodeName:}" failed. 
No retries permitted until 2026-04-21 01:50:38.427597627 +0000 UTC m=+18.197559791 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-f22tv" (UniqueName: "kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv") pod "network-check-target-z86fj" (UID: "ee736496-b4a2-4832-ab28-516d69f51886") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:30.773892 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:30.773403 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:30.773892 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:30.773527 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a" Apr 21 01:50:30.773892 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:30.773674 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:30.773892 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:30.773784 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886" Apr 21 01:50:32.772631 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:32.772592 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:32.772631 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:32.772621 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:32.773130 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:32.772709 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886" Apr 21 01:50:32.773130 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:32.772769 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a" Apr 21 01:50:34.773049 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:34.773014 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:34.773611 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:34.773057 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:34.773611 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:34.773153 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a" Apr 21 01:50:34.773611 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:34.773285 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886" Apr 21 01:50:36.772845 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:36.772808 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:36.773288 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:36.772814 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:36.773288 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:36.772949 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886" Apr 21 01:50:36.773288 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:36.773034 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a" Apr 21 01:50:38.387039 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:38.387003 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:38.387419 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:38.387121 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:38.387419 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:38.387171 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs podName:333616b1-f960-4eb6-b4fd-448534b9cd3a nodeName:}" failed. No retries permitted until 2026-04-21 01:50:54.387158158 +0000 UTC m=+34.157120312 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs") pod "network-metrics-daemon-pqvmq" (UID: "333616b1-f960-4eb6-b4fd-448534b9cd3a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:38.488120 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:38.488079 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f22tv\" (UniqueName: \"kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv\") pod \"network-check-target-z86fj\" (UID: \"ee736496-b4a2-4832-ab28-516d69f51886\") " pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:38.488294 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:38.488219 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:50:38.488294 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:38.488239 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:50:38.488294 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:38.488250 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f22tv for pod openshift-network-diagnostics/network-check-target-z86fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:38.488439 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:38.488325 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv podName:ee736496-b4a2-4832-ab28-516d69f51886 nodeName:}" failed. 
No retries permitted until 2026-04-21 01:50:54.488292841 +0000 UTC m=+34.258255008 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-f22tv" (UniqueName: "kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv") pod "network-check-target-z86fj" (UID: "ee736496-b4a2-4832-ab28-516d69f51886") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:38.773047 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:38.772953 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:38.773220 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:38.773160 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a" Apr 21 01:50:38.773274 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:38.773226 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:38.773383 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:38.773346 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886" Apr 21 01:50:40.773006 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.772854 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:50:40.773509 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.772906 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:40.773509 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:40.773084 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a" Apr 21 01:50:40.773509 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:40.773185 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886" Apr 21 01:50:40.912971 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.912942 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bv5lq" event={"ID":"94b3f448-6380-4226-b329-a7e8b2cad657","Type":"ContainerStarted","Data":"e5b5d11fc9da05f4d9f15083b5e48b922da5a6eebe3cac575d273730fc79682e"} Apr 21 01:50:40.914176 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.914154 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rtnjq" event={"ID":"899e33b4-0828-4733-86e4-56750cb8ec32","Type":"ContainerStarted","Data":"2b51a95a5e8f948599885403fff94aae913b17d24b3402b3f707ae0ace717aa9"} Apr 21 01:50:40.915296 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.915274 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rmwq4" event={"ID":"4d5eb879-dd29-4c7f-8643-6f9a6b561eda","Type":"ContainerStarted","Data":"4b1c76423c22e02c468f52379024035934fc175d83a98d65cb5c0080c89e4ca9"} Apr 21 01:50:40.916435 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.916417 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" event={"ID":"9414d859-6831-469a-aaca-dd269e3b122c","Type":"ContainerStarted","Data":"f180c9d31f21f0580d6c11d96100544b69f8974cfe7014b8e550f1dc72284a47"} Apr 21 01:50:40.917433 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.917414 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-shp9g" event={"ID":"d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8","Type":"ContainerStarted","Data":"319fa549cc57474671c43063f867ee8ff717bf83f95eb87a7f1d3b369e05e35c"} Apr 21 01:50:40.918834 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.918819 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/0.log" Apr 21 01:50:40.919109 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.919082 2568 generic.go:358] "Generic (PLEG): container finished" podID="8889bb55-ecc3-4f0f-b6a3-5c5f2e739440" containerID="99cc4806f0b7b61aef718b2ec2bd3606b479a1938b92dbcfea1e916a391c8f60" exitCode=1 Apr 21 01:50:40.919168 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.919132 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" event={"ID":"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440","Type":"ContainerStarted","Data":"0d2bce0beae0e97bd20dedfb057c9b983e889740b25b01b25a24ae6d63c07807"} Apr 21 01:50:40.919168 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.919146 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" event={"ID":"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440","Type":"ContainerDied","Data":"99cc4806f0b7b61aef718b2ec2bd3606b479a1938b92dbcfea1e916a391c8f60"} Apr 21 01:50:40.919168 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.919156 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" event={"ID":"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440","Type":"ContainerStarted","Data":"8ec951fc3dca0fca4be39bd4935ac49a0816a213e82cab94e23e1e262d887817"} Apr 21 01:50:40.920267 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.920246 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" event={"ID":"0e6e0ca7-995a-4155-94d3-02572b72d57c","Type":"ContainerStarted","Data":"a6b62a4de73c6e392430c6c86b143cd9337e9277cb8c8cb4c835f43b1ac74441"} Apr 21 01:50:40.921371 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.921355 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" 
event={"ID":"446458a6-4d58-4666-88b6-92203ea344ee","Type":"ContainerStarted","Data":"70c15830ce50c8cc4024bb49705bbb089981b38c54c9af23cfd12e7376a8ba01"}
Apr 21 01:50:40.925980 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.925948 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bv5lq" podStartSLOduration=12.089004872 podStartE2EDuration="20.92593873s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:23.310904625 +0000 UTC m=+3.080866779" lastFinishedPulling="2026-04-21 01:50:32.14783847 +0000 UTC m=+11.917800637" observedRunningTime="2026-04-21 01:50:40.925518673 +0000 UTC m=+20.695480849" watchObservedRunningTime="2026-04-21 01:50:40.92593873 +0000 UTC m=+20.695900904"
Apr 21 01:50:40.938331 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.937999 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-shp9g" podStartSLOduration=3.914341295 podStartE2EDuration="20.937982921s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:23.31147687 +0000 UTC m=+3.081439023" lastFinishedPulling="2026-04-21 01:50:40.335118475 +0000 UTC m=+20.105080649" observedRunningTime="2026-04-21 01:50:40.937950972 +0000 UTC m=+20.707913149" watchObservedRunningTime="2026-04-21 01:50:40.937982921 +0000 UTC m=+20.707945096"
Apr 21 01:50:40.954185 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.954143 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rmwq4" podStartSLOduration=3.926629954 podStartE2EDuration="20.954131858s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:23.316736523 +0000 UTC m=+3.086698691" lastFinishedPulling="2026-04-21 01:50:40.344238437 +0000 UTC m=+20.114200595" observedRunningTime="2026-04-21 01:50:40.9537841 +0000 UTC m=+20.723746271" watchObservedRunningTime="2026-04-21 01:50:40.954131858 +0000 UTC m=+20.724094030"
Apr 21 01:50:40.993447 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:40.992735 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qwhhg" podStartSLOduration=3.963736828 podStartE2EDuration="20.992715799s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:23.306211649 +0000 UTC m=+3.076173803" lastFinishedPulling="2026-04-21 01:50:40.335190619 +0000 UTC m=+20.105152774" observedRunningTime="2026-04-21 01:50:40.992530959 +0000 UTC m=+20.762493135" watchObservedRunningTime="2026-04-21 01:50:40.992715799 +0000 UTC m=+20.762677977"
Apr 21 01:50:41.005732 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.005684 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rtnjq" podStartSLOduration=4.017768655 podStartE2EDuration="21.005663967s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:23.317375299 +0000 UTC m=+3.087337473" lastFinishedPulling="2026-04-21 01:50:40.305270618 +0000 UTC m=+20.075232785" observedRunningTime="2026-04-21 01:50:41.005636119 +0000 UTC m=+20.775598295" watchObservedRunningTime="2026-04-21 01:50:41.005663967 +0000 UTC m=+20.775626144"
Apr 21 01:50:41.887222 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.887076 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 01:50:41.924331 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.924249 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" event={"ID":"9414d859-6831-469a-aaca-dd269e3b122c","Type":"ContainerStarted","Data":"ffbbe7414680d8e0cdaafbc65c78f65d59b2a5e066fc69b92ecb59cdc8acf013"}
Apr 21 01:50:41.925559 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.925524 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kcf2g" event={"ID":"87063399-3100-43c9-a654-5ca49d106c2c","Type":"ContainerStarted","Data":"1ff3de20a4c46fe133798b9e65d3b570332625b29ae549b1a8297cb582252cc6"}
Apr 21 01:50:41.925669 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.925654 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rtnjq"
Apr 21 01:50:41.926282 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.926263 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rtnjq"
Apr 21 01:50:41.927793 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.927778 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/0.log"
Apr 21 01:50:41.928069 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.928052 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" event={"ID":"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440","Type":"ContainerStarted","Data":"3f835efdd3e2e5922e6928de80b5dbe3871e15b8b7c3f4077d006e836040cbdb"}
Apr 21 01:50:41.928110 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.928077 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" event={"ID":"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440","Type":"ContainerStarted","Data":"37bf672fb17f3c6cd774f76a88162c20368f186bdc9565233b85140a9d598230"}
Apr 21 01:50:41.928110 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.928089 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" event={"ID":"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440","Type":"ContainerStarted","Data":"9d8bd15162a87aaf39b580f39a9b964672db27c0896426149d7c18383768325c"}
Apr 21 01:50:41.929291 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.929272 2568 generic.go:358] "Generic (PLEG): container finished" podID="446458a6-4d58-4666-88b6-92203ea344ee" containerID="70c15830ce50c8cc4024bb49705bbb089981b38c54c9af23cfd12e7376a8ba01" exitCode=0
Apr 21 01:50:41.929406 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.929366 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" event={"ID":"446458a6-4d58-4666-88b6-92203ea344ee","Type":"ContainerDied","Data":"70c15830ce50c8cc4024bb49705bbb089981b38c54c9af23cfd12e7376a8ba01"}
Apr 21 01:50:41.929624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.929607 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rtnjq"
Apr 21 01:50:41.930029 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.930015 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rtnjq"
Apr 21 01:50:41.938939 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:41.938905 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kcf2g" podStartSLOduration=4.917387751 podStartE2EDuration="21.938895826s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:23.313241593 +0000 UTC m=+3.083203748" lastFinishedPulling="2026-04-21 01:50:40.334749664 +0000 UTC m=+20.104711823" observedRunningTime="2026-04-21 01:50:41.938337379 +0000 UTC m=+21.708299551" watchObservedRunningTime="2026-04-21 01:50:41.938895826 +0000 UTC m=+21.708858000"
Apr 21 01:50:42.718153 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:42.718047 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T01:50:41.887215262Z","UUID":"20dcb547-10e8-47de-afb6-bbec644ec438","Handler":null,"Name":"","Endpoint":""}
Apr 21 01:50:42.720124 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:42.720100 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 01:50:42.720242 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:42.720133 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 01:50:42.772550 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:42.772520 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj"
Apr 21 01:50:42.772710 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:42.772638 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886"
Apr 21 01:50:42.772710 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:42.772658 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:42.772836 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:42.772782 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a"
Apr 21 01:50:43.935278 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:43.935054 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" event={"ID":"9414d859-6831-469a-aaca-dd269e3b122c","Type":"ContainerStarted","Data":"d91b86d87ac0d499b447ac5bcd0f0ce599d7e965073eb4b9cd6d2c08a4c3a4c0"}
Apr 21 01:50:43.937911 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:43.937887 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/0.log"
Apr 21 01:50:43.938362 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:43.938333 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" event={"ID":"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440","Type":"ContainerStarted","Data":"e2801b72ff40d0db21488b6de20930ab82331247d50a2e958134152c16c921c6"}
Apr 21 01:50:43.950808 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:43.950760 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bnf9w" podStartSLOduration=4.069317718 podStartE2EDuration="23.950748248s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:23.315678467 +0000 UTC m=+3.085640620" lastFinishedPulling="2026-04-21 01:50:43.197108987 +0000 UTC m=+22.967071150" observedRunningTime="2026-04-21 01:50:43.950693662 +0000 UTC m=+23.720655839" watchObservedRunningTime="2026-04-21 01:50:43.950748248 +0000 UTC m=+23.720710406"
Apr 21 01:50:44.772324 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:44.772278 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:44.772492 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:44.772278 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj"
Apr 21 01:50:44.772492 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:44.772398 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a"
Apr 21 01:50:44.772614 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:44.772484 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886"
Apr 21 01:50:45.946705 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:45.946470 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/0.log"
Apr 21 01:50:45.947505 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:45.947114 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" event={"ID":"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440","Type":"ContainerStarted","Data":"2c364dbbf72582498d48593cb07a8e1128c8de43e3fd9771c9b19cc86963150d"}
Apr 21 01:50:45.948240 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:45.947610 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:45.948240 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:45.947743 2568 scope.go:117] "RemoveContainer" containerID="99cc4806f0b7b61aef718b2ec2bd3606b479a1938b92dbcfea1e916a391c8f60"
Apr 21 01:50:45.962506 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:45.962481 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:46.772503 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:46.772471 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj"
Apr 21 01:50:46.772669 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:46.772475 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:46.772669 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:46.772563 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886"
Apr 21 01:50:46.772669 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:46.772635 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a"
Apr 21 01:50:46.951565 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:46.951538 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/0.log"
Apr 21 01:50:46.951908 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:46.951879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" event={"ID":"8889bb55-ecc3-4f0f-b6a3-5c5f2e739440","Type":"ContainerStarted","Data":"f39665396fc1d29f70bcd53f6b27b4809ccdc1fe409e5da9a0ece097a71e1ad1"}
Apr 21 01:50:46.952172 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:46.952152 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:46.952172 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:46.952180 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:46.953451 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:46.953427 2568 generic.go:358] "Generic (PLEG): container finished" podID="446458a6-4d58-4666-88b6-92203ea344ee" containerID="73bc7ecbf3cfac61e9949111b3294dd70056882fac30c8b691fd560f2b9ae626" exitCode=0
Apr 21 01:50:46.953537 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:46.953464 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" event={"ID":"446458a6-4d58-4666-88b6-92203ea344ee","Type":"ContainerDied","Data":"73bc7ecbf3cfac61e9949111b3294dd70056882fac30c8b691fd560f2b9ae626"}
Apr 21 01:50:46.966544 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:46.966495 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2"
Apr 21 01:50:46.976832 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:46.976798 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" podStartSLOduration=9.896171207 podStartE2EDuration="26.976788197s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:23.309212316 +0000 UTC m=+3.079174470" lastFinishedPulling="2026-04-21 01:50:40.389829293 +0000 UTC m=+20.159791460" observedRunningTime="2026-04-21 01:50:46.97667435 +0000 UTC m=+26.746636525" watchObservedRunningTime="2026-04-21 01:50:46.976788197 +0000 UTC m=+26.746750385"
Apr 21 01:50:47.782201 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:47.782048 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z86fj"]
Apr 21 01:50:47.782342 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:47.782288 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj"
Apr 21 01:50:47.782438 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:47.782385 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886"
Apr 21 01:50:47.784767 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:47.784746 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pqvmq"]
Apr 21 01:50:47.784877 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:47.784836 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:47.784941 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:47.784905 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a"
Apr 21 01:50:47.957075 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:47.957039 2568 generic.go:358] "Generic (PLEG): container finished" podID="446458a6-4d58-4666-88b6-92203ea344ee" containerID="4c1379d5263d30d50cd8e4abe69b927651a3fe5aefcd1e64e5d10157724fee1e" exitCode=0
Apr 21 01:50:47.957457 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:47.957126 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" event={"ID":"446458a6-4d58-4666-88b6-92203ea344ee","Type":"ContainerDied","Data":"4c1379d5263d30d50cd8e4abe69b927651a3fe5aefcd1e64e5d10157724fee1e"}
Apr 21 01:50:48.960921 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:48.960891 2568 generic.go:358] "Generic (PLEG): container finished" podID="446458a6-4d58-4666-88b6-92203ea344ee" containerID="7248d824601a0b11b46c1393f2a361fed9ebb8a94054ef7181223e3ea63d8b05" exitCode=0
Apr 21 01:50:48.961356 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:48.960975 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" event={"ID":"446458a6-4d58-4666-88b6-92203ea344ee","Type":"ContainerDied","Data":"7248d824601a0b11b46c1393f2a361fed9ebb8a94054ef7181223e3ea63d8b05"}
Apr 21 01:50:49.772613 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:49.772583 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj"
Apr 21 01:50:49.772765 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:49.772585 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:49.772765 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:49.772697 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886"
Apr 21 01:50:49.772877 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:49.772830 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a"
Apr 21 01:50:51.772420 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:51.772389 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj"
Apr 21 01:50:51.772972 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:51.772389 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:51.772972 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:51.772499 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z86fj" podUID="ee736496-b4a2-4832-ab28-516d69f51886"
Apr 21 01:50:51.772972 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:51.772572 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pqvmq" podUID="333616b1-f960-4eb6-b4fd-448534b9cd3a"
Apr 21 01:50:53.549445 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.549366 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-35.ec2.internal" event="NodeReady"
Apr 21 01:50:53.550041 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.549518 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 01:50:53.587670 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.587615 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fjfcj"]
Apr 21 01:50:53.590597 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.590577 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9zg7k"]
Apr 21 01:50:53.590774 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.590751 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.592431 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.592414 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9zg7k"
Apr 21 01:50:53.593069 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.593051 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 01:50:53.593165 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.593089 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 01:50:53.593165 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.593089 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vd9s5\""
Apr 21 01:50:53.594634 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.594615 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 01:50:53.594732 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.594643 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 01:50:53.594732 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.594663 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 01:50:53.594732 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.594724 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbqdn\""
Apr 21 01:50:53.600287 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.600261 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fjfcj"]
Apr 21 01:50:53.614232 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.614212 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9zg7k"]
Apr 21 01:50:53.695999 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.695964 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.696182 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.696010 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k"
Apr 21 01:50:53.696182 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.696086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-tmp-dir\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.696182 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.696170 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5h2s\" (UniqueName: \"kubernetes.io/projected/7d936ad0-666c-49cb-8f95-d04607cc5b52-kube-api-access-j5h2s\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k"
Apr 21 01:50:53.696351 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.696214 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-config-volume\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.696351 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.696246 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bmbl\" (UniqueName: \"kubernetes.io/projected/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-kube-api-access-6bmbl\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.773022 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.772991 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj"
Apr 21 01:50:53.773199 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.773001 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:53.775984 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.775946 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 01:50:53.775984 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.775970 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nb98s\""
Apr 21 01:50:53.775984 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.775977 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-68xr4\""
Apr 21 01:50:53.776204 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.776135 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 01:50:53.776258 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.776242 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 01:50:53.796820 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.796792 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.796955 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.796833 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k"
Apr 21 01:50:53.796955 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.796861 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-tmp-dir\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.796955 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.796908 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5h2s\" (UniqueName: \"kubernetes.io/projected/7d936ad0-666c-49cb-8f95-d04607cc5b52-kube-api-access-j5h2s\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k"
Apr 21 01:50:53.796955 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.796938 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-config-volume\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.796955 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:53.796947 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 01:50:53.797188 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.796961 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bmbl\" (UniqueName: \"kubernetes.io/projected/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-kube-api-access-6bmbl\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.797188 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:53.796947 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 01:50:53.797188 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:53.797023 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert podName:7d936ad0-666c-49cb-8f95-d04607cc5b52 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:54.297001921 +0000 UTC m=+34.066964305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert") pod "ingress-canary-9zg7k" (UID: "7d936ad0-666c-49cb-8f95-d04607cc5b52") : secret "canary-serving-cert" not found
Apr 21 01:50:53.797188 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:53.797106 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls podName:82bc4cd8-ea54-4a2a-ae0f-172581c8dace nodeName:}" failed. No retries permitted until 2026-04-21 01:50:54.297085037 +0000 UTC m=+34.067047475 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls") pod "dns-default-fjfcj" (UID: "82bc4cd8-ea54-4a2a-ae0f-172581c8dace") : secret "dns-default-metrics-tls" not found
Apr 21 01:50:53.797461 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.797240 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-tmp-dir\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.797526 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.797509 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-config-volume\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.810655 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.810589 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bmbl\" (UniqueName: \"kubernetes.io/projected/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-kube-api-access-6bmbl\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:53.810772 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:53.810691 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5h2s\" (UniqueName: \"kubernetes.io/projected/7d936ad0-666c-49cb-8f95-d04607cc5b52-kube-api-access-j5h2s\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k"
Apr 21 01:50:54.302118 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:54.302088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:50:54.302118 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:54.302121 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k"
Apr 21 01:50:54.302317 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:54.302232 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 01:50:54.302317 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:54.302285 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert podName:7d936ad0-666c-49cb-8f95-d04607cc5b52 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:55.302271616 +0000 UTC m=+35.072233768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert") pod "ingress-canary-9zg7k" (UID: "7d936ad0-666c-49cb-8f95-d04607cc5b52") : secret "canary-serving-cert" not found
Apr 21 01:50:54.302317 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:54.302233 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 01:50:54.302419 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:54.302357 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls podName:82bc4cd8-ea54-4a2a-ae0f-172581c8dace nodeName:}" failed. No retries permitted until 2026-04-21 01:50:55.302341245 +0000 UTC m=+35.072303410 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls") pod "dns-default-fjfcj" (UID: "82bc4cd8-ea54-4a2a-ae0f-172581c8dace") : secret "dns-default-metrics-tls" not found
Apr 21 01:50:54.402437 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:54.402408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:50:54.402598 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:54.402520 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 01:50:54.402598 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:54.402568 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs podName:333616b1-f960-4eb6-b4fd-448534b9cd3a nodeName:}" failed. No retries permitted until 2026-04-21 01:51:26.402555862 +0000 UTC m=+66.172518015 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs") pod "network-metrics-daemon-pqvmq" (UID: "333616b1-f960-4eb6-b4fd-448534b9cd3a") : secret "metrics-daemon-secret" not found Apr 21 01:50:54.503065 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:54.503041 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f22tv\" (UniqueName: \"kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv\") pod \"network-check-target-z86fj\" (UID: \"ee736496-b4a2-4832-ab28-516d69f51886\") " pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:54.505939 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:54.505913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f22tv\" (UniqueName: \"kubernetes.io/projected/ee736496-b4a2-4832-ab28-516d69f51886-kube-api-access-f22tv\") pod \"network-check-target-z86fj\" (UID: \"ee736496-b4a2-4832-ab28-516d69f51886\") " pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:54.684979 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:54.684947 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:50:54.854875 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:54.854705 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z86fj"] Apr 21 01:50:54.858262 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:50:54.858232 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee736496_b4a2_4832_ab28_516d69f51886.slice/crio-39b715efbaabb96b032903a3376eb50ed61270e29e2799193922d900de709655 WatchSource:0}: Error finding container 39b715efbaabb96b032903a3376eb50ed61270e29e2799193922d900de709655: Status 404 returned error can't find the container with id 39b715efbaabb96b032903a3376eb50ed61270e29e2799193922d900de709655 Apr 21 01:50:54.978865 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:54.978828 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z86fj" event={"ID":"ee736496-b4a2-4832-ab28-516d69f51886","Type":"ContainerStarted","Data":"39b715efbaabb96b032903a3376eb50ed61270e29e2799193922d900de709655"} Apr 21 01:50:54.981130 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:54.981098 2568 generic.go:358] "Generic (PLEG): container finished" podID="446458a6-4d58-4666-88b6-92203ea344ee" containerID="cae1cce3f7bd4b8d43253c359368d8ab2d91e6ae8901c34bc836d2b8f1c3aed0" exitCode=0 Apr 21 01:50:54.981244 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:54.981143 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" event={"ID":"446458a6-4d58-4666-88b6-92203ea344ee","Type":"ContainerDied","Data":"cae1cce3f7bd4b8d43253c359368d8ab2d91e6ae8901c34bc836d2b8f1c3aed0"} Apr 21 01:50:55.309066 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:55.309025 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj" Apr 21 01:50:55.309245 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:55.309070 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k" Apr 21 01:50:55.309245 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:55.309181 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:50:55.309367 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:55.309255 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls podName:82bc4cd8-ea54-4a2a-ae0f-172581c8dace nodeName:}" failed. No retries permitted until 2026-04-21 01:50:57.309234529 +0000 UTC m=+37.079196688 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls") pod "dns-default-fjfcj" (UID: "82bc4cd8-ea54-4a2a-ae0f-172581c8dace") : secret "dns-default-metrics-tls" not found Apr 21 01:50:55.309367 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:55.309189 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:50:55.309367 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:55.309347 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert podName:7d936ad0-666c-49cb-8f95-d04607cc5b52 nodeName:}" failed. 
No retries permitted until 2026-04-21 01:50:57.30933003 +0000 UTC m=+37.079292189 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert") pod "ingress-canary-9zg7k" (UID: "7d936ad0-666c-49cb-8f95-d04607cc5b52") : secret "canary-serving-cert" not found Apr 21 01:50:55.985846 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:55.985812 2568 generic.go:358] "Generic (PLEG): container finished" podID="446458a6-4d58-4666-88b6-92203ea344ee" containerID="4c920de5079f1f4477d2def5fe99022f47c0147fcd1cf09cc00f8abf2c5c32bf" exitCode=0 Apr 21 01:50:55.986250 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:55.985861 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" event={"ID":"446458a6-4d58-4666-88b6-92203ea344ee","Type":"ContainerDied","Data":"4c920de5079f1f4477d2def5fe99022f47c0147fcd1cf09cc00f8abf2c5c32bf"} Apr 21 01:50:56.991154 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:56.991116 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" event={"ID":"446458a6-4d58-4666-88b6-92203ea344ee","Type":"ContainerStarted","Data":"5aaac1c682ca563cb9ad7747be9b016ba4a3271c864534e18cebad9abb7a243f"} Apr 21 01:50:57.012037 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:57.011983 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wzkbb" podStartSLOduration=5.847463074 podStartE2EDuration="37.011963994s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:23.30452621 +0000 UTC m=+3.074488378" lastFinishedPulling="2026-04-21 01:50:54.469027142 +0000 UTC m=+34.238989298" observedRunningTime="2026-04-21 01:50:57.010231234 +0000 UTC m=+36.780193413" watchObservedRunningTime="2026-04-21 01:50:57.011963994 +0000 UTC m=+36.781926175" Apr 21 01:50:57.323927 
ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:57.323895 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj" Apr 21 01:50:57.324072 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:57.323939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k" Apr 21 01:50:57.324072 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:57.324055 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:50:57.324072 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:57.324058 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:50:57.324210 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:57.324113 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert podName:7d936ad0-666c-49cb-8f95-d04607cc5b52 nodeName:}" failed. No retries permitted until 2026-04-21 01:51:01.324097123 +0000 UTC m=+41.094059276 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert") pod "ingress-canary-9zg7k" (UID: "7d936ad0-666c-49cb-8f95-d04607cc5b52") : secret "canary-serving-cert" not found Apr 21 01:50:57.324210 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:50:57.324132 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls podName:82bc4cd8-ea54-4a2a-ae0f-172581c8dace nodeName:}" failed. No retries permitted until 2026-04-21 01:51:01.324122487 +0000 UTC m=+41.094084645 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls") pod "dns-default-fjfcj" (UID: "82bc4cd8-ea54-4a2a-ae0f-172581c8dace") : secret "dns-default-metrics-tls" not found Apr 21 01:50:57.994669 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:57.994631 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z86fj" event={"ID":"ee736496-b4a2-4832-ab28-516d69f51886","Type":"ContainerStarted","Data":"d70711be04ce664e19a9f61af42782783363d4c867238e75d0a1c8521e59e24e"} Apr 21 01:50:58.008910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:58.008806 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-z86fj" podStartSLOduration=35.110102058 podStartE2EDuration="38.008793981s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:54.860406402 +0000 UTC m=+34.630368557" lastFinishedPulling="2026-04-21 01:50:57.759098324 +0000 UTC m=+37.529060480" observedRunningTime="2026-04-21 01:50:58.008025622 +0000 UTC m=+37.777987798" watchObservedRunningTime="2026-04-21 01:50:58.008793981 +0000 UTC m=+37.778756155" Apr 21 01:50:58.996801 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:50:58.996773 2568 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:51:01.352703 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:01.352669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj" Apr 21 01:51:01.352703 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:01.352703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k" Apr 21 01:51:01.353167 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:01.352800 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:51:01.353167 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:01.352809 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:51:01.353167 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:01.352851 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert podName:7d936ad0-666c-49cb-8f95-d04607cc5b52 nodeName:}" failed. No retries permitted until 2026-04-21 01:51:09.352836336 +0000 UTC m=+49.122798489 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert") pod "ingress-canary-9zg7k" (UID: "7d936ad0-666c-49cb-8f95-d04607cc5b52") : secret "canary-serving-cert" not found Apr 21 01:51:01.353167 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:01.352864 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls podName:82bc4cd8-ea54-4a2a-ae0f-172581c8dace nodeName:}" failed. No retries permitted until 2026-04-21 01:51:09.35285796 +0000 UTC m=+49.122820113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls") pod "dns-default-fjfcj" (UID: "82bc4cd8-ea54-4a2a-ae0f-172581c8dace") : secret "dns-default-metrics-tls" not found Apr 21 01:51:09.408534 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:09.408500 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj" Apr 21 01:51:09.408534 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:09.408535 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k" Apr 21 01:51:09.408996 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:09.408634 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:51:09.408996 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:09.408635 2568 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:51:09.408996 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:09.408694 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls podName:82bc4cd8-ea54-4a2a-ae0f-172581c8dace nodeName:}" failed. No retries permitted until 2026-04-21 01:51:25.40867942 +0000 UTC m=+65.178641574 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls") pod "dns-default-fjfcj" (UID: "82bc4cd8-ea54-4a2a-ae0f-172581c8dace") : secret "dns-default-metrics-tls" not found Apr 21 01:51:09.408996 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:09.408708 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert podName:7d936ad0-666c-49cb-8f95-d04607cc5b52 nodeName:}" failed. No retries permitted until 2026-04-21 01:51:25.408702492 +0000 UTC m=+65.178664645 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert") pod "ingress-canary-9zg7k" (UID: "7d936ad0-666c-49cb-8f95-d04607cc5b52") : secret "canary-serving-cert" not found Apr 21 01:51:18.971541 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:18.971513 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvzx2" Apr 21 01:51:25.509457 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:25.509424 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj" Apr 21 01:51:25.509457 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:25.509465 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k" Apr 21 01:51:25.509990 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:25.509563 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:51:25.509990 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:25.509619 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:51:25.509990 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:25.509632 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls podName:82bc4cd8-ea54-4a2a-ae0f-172581c8dace nodeName:}" failed. 
No retries permitted until 2026-04-21 01:51:57.509617849 +0000 UTC m=+97.279580003 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls") pod "dns-default-fjfcj" (UID: "82bc4cd8-ea54-4a2a-ae0f-172581c8dace") : secret "dns-default-metrics-tls" not found Apr 21 01:51:25.509990 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:25.509669 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert podName:7d936ad0-666c-49cb-8f95-d04607cc5b52 nodeName:}" failed. No retries permitted until 2026-04-21 01:51:57.509654105 +0000 UTC m=+97.279616257 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert") pod "ingress-canary-9zg7k" (UID: "7d936ad0-666c-49cb-8f95-d04607cc5b52") : secret "canary-serving-cert" not found Apr 21 01:51:26.416075 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:26.416034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq" Apr 21 01:51:26.416243 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:26.416187 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 01:51:26.416283 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:26.416264 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs podName:333616b1-f960-4eb6-b4fd-448534b9cd3a nodeName:}" failed. No retries permitted until 2026-04-21 01:52:30.416247301 +0000 UTC m=+130.186209454 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs") pod "network-metrics-daemon-pqvmq" (UID: "333616b1-f960-4eb6-b4fd-448534b9cd3a") : secret "metrics-daemon-secret" not found Apr 21 01:51:30.001398 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:30.001369 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-z86fj" Apr 21 01:51:32.463893 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.463856 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv"] Apr 21 01:51:32.467073 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.467020 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps"] Apr 21 01:51:32.467339 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.467301 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv" Apr 21 01:51:32.469694 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.469676 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 01:51:32.470201 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.470175 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-bfwxr\"" Apr 21 01:51:32.470354 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.470333 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.470653 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.470200 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-cqf6k"] Apr 21 01:51:32.471261 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.470240 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 01:51:32.473492 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.473474 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 01:51:32.473876 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.473856 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-jhbjd\"" Apr 21 01:51:32.474365 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.474346 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 01:51:32.474443 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.474353 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 01:51:32.474443 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.474398 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 01:51:32.474734 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.474716 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fx5mm"] Apr 21 01:51:32.474853 
ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.474839 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.477060 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.477040 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 01:51:32.477154 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.477136 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 01:51:32.477154 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.477146 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 01:51:32.477402 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.477386 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 01:51:32.478236 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.478150 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps"] Apr 21 01:51:32.478236 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.478199 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv"] Apr 21 01:51:32.478434 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.478241 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.478434 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.478329 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-gc9rm\"" Apr 21 01:51:32.480883 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.480867 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 01:51:32.481150 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.481128 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 01:51:32.481646 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.481627 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-22kt2\"" Apr 21 01:51:32.482081 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.482058 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 01:51:32.482202 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.482184 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 01:51:32.484510 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.484487 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-cqf6k"] Apr 21 01:51:32.484598 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.484523 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fx5mm"] Apr 21 01:51:32.485146 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.485124 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 01:51:32.487533 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.487514 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 01:51:32.561266 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.561246 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj"] Apr 21 01:51:32.564344 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.564330 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" Apr 21 01:51:32.566438 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.566415 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-cmm4d\"" Apr 21 01:51:32.566438 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.566431 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 01:51:32.566588 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.566461 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 01:51:32.566588 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.566465 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 01:51:32.566588 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.566508 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 01:51:32.568005 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.567971 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz"] Apr 21 01:51:32.571058 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.571041 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"] Apr 21 01:51:32.571322 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.571201 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" Apr 21 01:51:32.573338 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.573301 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 01:51:32.573460 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.573441 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 01:51:32.573776 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.573760 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-thrrr\"" Apr 21 01:51:32.575582 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.573906 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 01:51:32.576629 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.576612 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj"] Apr 21 01:51:32.576726 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.576715 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" Apr 21 01:51:32.578668 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.578648 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"] Apr 21 01:51:32.578777 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.578708 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6hxv6\"" Apr 21 01:51:32.578777 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.578770 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 01:51:32.579258 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.579237 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 01:51:32.579663 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.579639 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz"] Apr 21 01:51:32.580017 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.579998 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 01:51:32.580192 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.580002 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 01:51:32.660276 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660249 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz88z\" (UniqueName: \"kubernetes.io/projected/b78c9a0f-a030-444c-886f-a49679306c25-kube-api-access-dz88z\") pod \"volume-data-source-validator-7c6cbb6c87-f65qv\" (UID: 
\"b78c9a0f-a030-444c-886f-a49679306c25\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv" Apr 21 01:51:32.660425 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660280 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fca9b761-dba2-40df-99e6-41e04c0a7ffb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-75sps\" (UID: \"fca9b761-dba2-40df-99e6-41e04c0a7ffb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.660425 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660300 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c0ff07-3052-4608-8a7c-4b86babf4ea2-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.660425 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660330 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e91998d5-ea6d-4d46-8984-013ce4758689-serving-cert\") pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.660425 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660411 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c0ff07-3052-4608-8a7c-4b86babf4ea2-service-ca-bundle\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 
01:51:32.660608 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660455 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/35c0ff07-3052-4608-8a7c-4b86babf4ea2-snapshots\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.660608 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660484 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e91998d5-ea6d-4d46-8984-013ce4758689-trusted-ca\") pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.660608 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660526 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35c0ff07-3052-4608-8a7c-4b86babf4ea2-serving-cert\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.660608 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660552 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz5kh\" (UniqueName: \"kubernetes.io/projected/e91998d5-ea6d-4d46-8984-013ce4758689-kube-api-access-gz5kh\") pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.660608 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660572 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fca9b761-dba2-40df-99e6-41e04c0a7ffb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-75sps\" (UID: \"fca9b761-dba2-40df-99e6-41e04c0a7ffb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.660608 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660587 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/35c0ff07-3052-4608-8a7c-4b86babf4ea2-tmp\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.660608 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660603 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91998d5-ea6d-4d46-8984-013ce4758689-config\") pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.660826 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660622 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n44d\" (UniqueName: \"kubernetes.io/projected/fca9b761-dba2-40df-99e6-41e04c0a7ffb-kube-api-access-4n44d\") pod \"kube-storage-version-migrator-operator-6769c5d45-75sps\" (UID: \"fca9b761-dba2-40df-99e6-41e04c0a7ffb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.660826 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.660646 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcvcs\" (UniqueName: \"kubernetes.io/projected/35c0ff07-3052-4608-8a7c-4b86babf4ea2-kube-api-access-rcvcs\") 
pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.761870 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.761799 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35c0ff07-3052-4608-8a7c-4b86babf4ea2-serving-cert\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.761870 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.761830 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890e70ae-a9fe-418c-8f2f-c3e3ba235674-config\") pod \"service-ca-operator-d6fc45fc5-8k7zj\" (UID: \"890e70ae-a9fe-418c-8f2f-c3e3ba235674\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" Apr 21 01:51:32.761870 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.761852 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz5kh\" (UniqueName: \"kubernetes.io/projected/e91998d5-ea6d-4d46-8984-013ce4758689-kube-api-access-gz5kh\") pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.762160 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.761975 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fca9b761-dba2-40df-99e6-41e04c0a7ffb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-75sps\" (UID: \"fca9b761-dba2-40df-99e6-41e04c0a7ffb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.762160 ip-10-0-141-35 
kubenswrapper[2568]: I0421 01:51:32.761996 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/35c0ff07-3052-4608-8a7c-4b86babf4ea2-tmp\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.762160 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91998d5-ea6d-4d46-8984-013ce4758689-config\") pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.762160 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762033 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" Apr 21 01:51:32.762160 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n44d\" (UniqueName: \"kubernetes.io/projected/fca9b761-dba2-40df-99e6-41e04c0a7ffb-kube-api-access-4n44d\") pod \"kube-storage-version-migrator-operator-6769c5d45-75sps\" (UID: \"fca9b761-dba2-40df-99e6-41e04c0a7ffb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.762160 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762086 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcvcs\" (UniqueName: 
\"kubernetes.io/projected/35c0ff07-3052-4608-8a7c-4b86babf4ea2-kube-api-access-rcvcs\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.762160 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762114 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/890e70ae-a9fe-418c-8f2f-c3e3ba235674-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8k7zj\" (UID: \"890e70ae-a9fe-418c-8f2f-c3e3ba235674\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" Apr 21 01:51:32.762160 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762139 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5b5m\" (UniqueName: \"kubernetes.io/projected/890e70ae-a9fe-418c-8f2f-c3e3ba235674-kube-api-access-j5b5m\") pod \"service-ca-operator-d6fc45fc5-8k7zj\" (UID: \"890e70ae-a9fe-418c-8f2f-c3e3ba235674\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762205 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dz88z\" (UniqueName: \"kubernetes.io/projected/b78c9a0f-a030-444c-886f-a49679306c25-kube-api-access-dz88z\") pod \"volume-data-source-validator-7c6cbb6c87-f65qv\" (UID: \"b78c9a0f-a030-444c-886f-a49679306c25\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fca9b761-dba2-40df-99e6-41e04c0a7ffb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-75sps\" (UID: 
\"fca9b761-dba2-40df-99e6-41e04c0a7ffb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c0ff07-3052-4608-8a7c-4b86babf4ea2-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e91998d5-ea6d-4d46-8984-013ce4758689-serving-cert\") pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c0ff07-3052-4608-8a7c-4b86babf4ea2-service-ca-bundle\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762407 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/35c0ff07-3052-4608-8a7c-4b86babf4ea2-snapshots\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762443 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gldb2\" (UniqueName: \"kubernetes.io/projected/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-kube-api-access-gldb2\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762455 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/35c0ff07-3052-4608-8a7c-4b86babf4ea2-tmp\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmf6\" (UniqueName: \"kubernetes.io/projected/cd5f55d0-df55-4f02-98ff-e867d88cad49-kube-api-access-fdmf6\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762535 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e91998d5-ea6d-4d46-8984-013ce4758689-trusted-ca\") 
pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.762607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762563 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" Apr 21 01:51:32.763078 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762773 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fca9b761-dba2-40df-99e6-41e04c0a7ffb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-75sps\" (UID: \"fca9b761-dba2-40df-99e6-41e04c0a7ffb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.763078 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762874 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91998d5-ea6d-4d46-8984-013ce4758689-config\") pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.763078 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.762981 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c0ff07-3052-4608-8a7c-4b86babf4ea2-service-ca-bundle\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.763261 ip-10-0-141-35 
kubenswrapper[2568]: I0421 01:51:32.763087 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/35c0ff07-3052-4608-8a7c-4b86babf4ea2-snapshots\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.763261 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.763167 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c0ff07-3052-4608-8a7c-4b86babf4ea2-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.763729 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.763712 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e91998d5-ea6d-4d46-8984-013ce4758689-trusted-ca\") pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.765933 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.765907 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fca9b761-dba2-40df-99e6-41e04c0a7ffb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-75sps\" (UID: \"fca9b761-dba2-40df-99e6-41e04c0a7ffb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.766011 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.765929 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e91998d5-ea6d-4d46-8984-013ce4758689-serving-cert\") pod \"console-operator-9d4b6777b-fx5mm\" 
(UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.766011 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.765964 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35c0ff07-3052-4608-8a7c-4b86babf4ea2-serving-cert\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.769518 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.769495 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcvcs\" (UniqueName: \"kubernetes.io/projected/35c0ff07-3052-4608-8a7c-4b86babf4ea2-kube-api-access-rcvcs\") pod \"insights-operator-585dfdc468-cqf6k\" (UID: \"35c0ff07-3052-4608-8a7c-4b86babf4ea2\") " pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.769603 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.769501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n44d\" (UniqueName: \"kubernetes.io/projected/fca9b761-dba2-40df-99e6-41e04c0a7ffb-kube-api-access-4n44d\") pod \"kube-storage-version-migrator-operator-6769c5d45-75sps\" (UID: \"fca9b761-dba2-40df-99e6-41e04c0a7ffb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.769668 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.769653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz5kh\" (UniqueName: \"kubernetes.io/projected/e91998d5-ea6d-4d46-8984-013ce4758689-kube-api-access-gz5kh\") pod \"console-operator-9d4b6777b-fx5mm\" (UID: \"e91998d5-ea6d-4d46-8984-013ce4758689\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.770623 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.770605 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz88z\" (UniqueName: \"kubernetes.io/projected/b78c9a0f-a030-444c-886f-a49679306c25-kube-api-access-dz88z\") pod \"volume-data-source-validator-7c6cbb6c87-f65qv\" (UID: \"b78c9a0f-a030-444c-886f-a49679306c25\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv" Apr 21 01:51:32.780660 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.780641 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv" Apr 21 01:51:32.788244 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.788221 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" Apr 21 01:51:32.793901 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.793880 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-cqf6k" Apr 21 01:51:32.798523 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.798502 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.866718 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gldb2\" (UniqueName: \"kubernetes.io/projected/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-kube-api-access-gldb2\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.866764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdmf6\" (UniqueName: \"kubernetes.io/projected/cd5f55d0-df55-4f02-98ff-e867d88cad49-kube-api-access-fdmf6\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.866794 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.866821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" Apr 21 01:51:32.868797 ip-10-0-141-35 
kubenswrapper[2568]: I0421 01:51:32.866862 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890e70ae-a9fe-418c-8f2f-c3e3ba235674-config\") pod \"service-ca-operator-d6fc45fc5-8k7zj\" (UID: \"890e70ae-a9fe-418c-8f2f-c3e3ba235674\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.866904 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.866929 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/890e70ae-a9fe-418c-8f2f-c3e3ba235674-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8k7zj\" (UID: \"890e70ae-a9fe-418c-8f2f-c3e3ba235674\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.866953 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5b5m\" (UniqueName: \"kubernetes.io/projected/890e70ae-a9fe-418c-8f2f-c3e3ba235674-kube-api-access-j5b5m\") pod \"service-ca-operator-d6fc45fc5-8k7zj\" (UID: \"890e70ae-a9fe-418c-8f2f-c3e3ba235674\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:32.868396 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 
01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:32.868475 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls podName:00c23ae5-f0a8-414b-9e12-1dfa9725e21a nodeName:}" failed. No retries permitted until 2026-04-21 01:51:33.368455313 +0000 UTC m=+73.138417478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tbrdm" (UID: "00c23ae5-f0a8-414b-9e12-1dfa9725e21a") : secret "cluster-monitoring-operator-tls" not found Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.868621 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890e70ae-a9fe-418c-8f2f-c3e3ba235674-config\") pod \"service-ca-operator-d6fc45fc5-8k7zj\" (UID: \"890e70ae-a9fe-418c-8f2f-c3e3ba235674\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:32.868708 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 01:51:32.868797 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:32.868767 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls podName:cd5f55d0-df55-4f02-98ff-e867d88cad49 nodeName:}" failed. No retries permitted until 2026-04-21 01:51:33.368750567 +0000 UTC m=+73.138712720 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-76shz" (UID: "cd5f55d0-df55-4f02-98ff-e867d88cad49") : secret "samples-operator-tls" not found
Apr 21 01:51:32.870321 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.870254 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"
Apr 21 01:51:32.878161 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.878111 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/890e70ae-a9fe-418c-8f2f-c3e3ba235674-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8k7zj\" (UID: \"890e70ae-a9fe-418c-8f2f-c3e3ba235674\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj"
Apr 21 01:51:32.891957 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.891418 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gldb2\" (UniqueName: \"kubernetes.io/projected/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-kube-api-access-gldb2\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"
Apr 21 01:51:32.891957 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.891905 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5b5m\" (UniqueName: \"kubernetes.io/projected/890e70ae-a9fe-418c-8f2f-c3e3ba235674-kube-api-access-j5b5m\") pod \"service-ca-operator-d6fc45fc5-8k7zj\" (UID: \"890e70ae-a9fe-418c-8f2f-c3e3ba235674\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj"
Apr 21 01:51:32.892751 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.892706 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmf6\" (UniqueName: \"kubernetes.io/projected/cd5f55d0-df55-4f02-98ff-e867d88cad49-kube-api-access-fdmf6\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz"
Apr 21 01:51:32.963735 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:32.963701 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fx5mm"]
Apr 21 01:51:32.967293 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:51:32.967266 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode91998d5_ea6d_4d46_8984_013ce4758689.slice/crio-ec9a87432d3bab0a526fccb2ceb8abacf86daab3f33e003cf274966c696fec95 WatchSource:0}: Error finding container ec9a87432d3bab0a526fccb2ceb8abacf86daab3f33e003cf274966c696fec95: Status 404 returned error can't find the container with id ec9a87432d3bab0a526fccb2ceb8abacf86daab3f33e003cf274966c696fec95
Apr 21 01:51:33.060372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:33.060279 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" event={"ID":"e91998d5-ea6d-4d46-8984-013ce4758689","Type":"ContainerStarted","Data":"ec9a87432d3bab0a526fccb2ceb8abacf86daab3f33e003cf274966c696fec95"}
Apr 21 01:51:33.178367 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:33.178338 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj"
Apr 21 01:51:33.186869 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:33.186723 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv"]
Apr 21 01:51:33.187903 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:33.187871 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-cqf6k"]
Apr 21 01:51:33.189028 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:33.189005 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps"]
Apr 21 01:51:33.189537 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:51:33.189514 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca9b761_dba2_40df_99e6_41e04c0a7ffb.slice/crio-db21f0d5a1e836c40911b2df379162614767b2c6cbee192e23888ea1cb4937f9 WatchSource:0}: Error finding container db21f0d5a1e836c40911b2df379162614767b2c6cbee192e23888ea1cb4937f9: Status 404 returned error can't find the container with id db21f0d5a1e836c40911b2df379162614767b2c6cbee192e23888ea1cb4937f9
Apr 21 01:51:33.190234 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:51:33.190195 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c0ff07_3052_4608_8a7c_4b86babf4ea2.slice/crio-8644512af808f41e14b2c48f67452759270fc75ea5e6c490101a5e0ad0b6f717 WatchSource:0}: Error finding container 8644512af808f41e14b2c48f67452759270fc75ea5e6c490101a5e0ad0b6f717: Status 404 returned error can't find the container with id 8644512af808f41e14b2c48f67452759270fc75ea5e6c490101a5e0ad0b6f717
Apr 21 01:51:33.190905 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:51:33.190880 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb78c9a0f_a030_444c_886f_a49679306c25.slice/crio-262a31222d797f3af8b3a008515126ae5df57203ee32d8d6b07c59ecd137cd4e WatchSource:0}: Error finding container 262a31222d797f3af8b3a008515126ae5df57203ee32d8d6b07c59ecd137cd4e: Status 404 returned error can't find the container with id 262a31222d797f3af8b3a008515126ae5df57203ee32d8d6b07c59ecd137cd4e
Apr 21 01:51:33.287280 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:33.287187 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj"]
Apr 21 01:51:33.291072 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:51:33.291043 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod890e70ae_a9fe_418c_8f2f_c3e3ba235674.slice/crio-eb26f0d80bd1f8033e44652e68954cb673f4a1bedf9b95a64d4f34d12d4bcd05 WatchSource:0}: Error finding container eb26f0d80bd1f8033e44652e68954cb673f4a1bedf9b95a64d4f34d12d4bcd05: Status 404 returned error can't find the container with id eb26f0d80bd1f8033e44652e68954cb673f4a1bedf9b95a64d4f34d12d4bcd05
Apr 21 01:51:33.369298 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:33.369276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"
Apr 21 01:51:33.369425 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:33.369363 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz"
Apr 21 01:51:33.369425 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:33.369415 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 01:51:33.369491 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:33.369451 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 01:51:33.369491 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:33.369477 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls podName:00c23ae5-f0a8-414b-9e12-1dfa9725e21a nodeName:}" failed. No retries permitted until 2026-04-21 01:51:34.369462628 +0000 UTC m=+74.139424781 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tbrdm" (UID: "00c23ae5-f0a8-414b-9e12-1dfa9725e21a") : secret "cluster-monitoring-operator-tls" not found
Apr 21 01:51:33.369563 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:33.369493 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls podName:cd5f55d0-df55-4f02-98ff-e867d88cad49 nodeName:}" failed. No retries permitted until 2026-04-21 01:51:34.369485423 +0000 UTC m=+74.139447576 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-76shz" (UID: "cd5f55d0-df55-4f02-98ff-e867d88cad49") : secret "samples-operator-tls" not found
Apr 21 01:51:34.064657 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.064551 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" event={"ID":"890e70ae-a9fe-418c-8f2f-c3e3ba235674","Type":"ContainerStarted","Data":"eb26f0d80bd1f8033e44652e68954cb673f4a1bedf9b95a64d4f34d12d4bcd05"}
Apr 21 01:51:34.067139 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.067080 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-cqf6k" event={"ID":"35c0ff07-3052-4608-8a7c-4b86babf4ea2","Type":"ContainerStarted","Data":"8644512af808f41e14b2c48f67452759270fc75ea5e6c490101a5e0ad0b6f717"}
Apr 21 01:51:34.068534 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.068481 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv" event={"ID":"b78c9a0f-a030-444c-886f-a49679306c25","Type":"ContainerStarted","Data":"262a31222d797f3af8b3a008515126ae5df57203ee32d8d6b07c59ecd137cd4e"}
Apr 21 01:51:34.071098 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.071041 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" event={"ID":"fca9b761-dba2-40df-99e6-41e04c0a7ffb","Type":"ContainerStarted","Data":"db21f0d5a1e836c40911b2df379162614767b2c6cbee192e23888ea1cb4937f9"}
Apr 21 01:51:34.189201 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.188143 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq"]
Apr 21 01:51:34.191439 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.191407 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq"
Apr 21 01:51:34.203340 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.203269 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq"]
Apr 21 01:51:34.203474 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.203436 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-65ps4\""
Apr 21 01:51:34.277340 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.277226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l4zf\" (UniqueName: \"kubernetes.io/projected/0021f619-a97e-4a5d-abda-e2dd3c7a3b80-kube-api-access-5l4zf\") pod \"network-check-source-8894fc9bd-l6sfq\" (UID: \"0021f619-a97e-4a5d-abda-e2dd3c7a3b80\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq"
Apr 21 01:51:34.378450 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.377638 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"
Apr 21 01:51:34.378450 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.377765 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5l4zf\" (UniqueName: \"kubernetes.io/projected/0021f619-a97e-4a5d-abda-e2dd3c7a3b80-kube-api-access-5l4zf\") pod \"network-check-source-8894fc9bd-l6sfq\" (UID: \"0021f619-a97e-4a5d-abda-e2dd3c7a3b80\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq"
Apr 21 01:51:34.378450 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.377801 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz"
Apr 21 01:51:34.378450 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:34.377937 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 01:51:34.378450 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:34.378002 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls podName:cd5f55d0-df55-4f02-98ff-e867d88cad49 nodeName:}" failed. No retries permitted until 2026-04-21 01:51:36.377982448 +0000 UTC m=+76.147944608 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-76shz" (UID: "cd5f55d0-df55-4f02-98ff-e867d88cad49") : secret "samples-operator-tls" not found
Apr 21 01:51:34.378450 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:34.378404 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 01:51:34.378893 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:34.378467 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls podName:00c23ae5-f0a8-414b-9e12-1dfa9725e21a nodeName:}" failed. No retries permitted until 2026-04-21 01:51:36.378451586 +0000 UTC m=+76.148413743 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tbrdm" (UID: "00c23ae5-f0a8-414b-9e12-1dfa9725e21a") : secret "cluster-monitoring-operator-tls" not found
Apr 21 01:51:34.391868 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.391780 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l4zf\" (UniqueName: \"kubernetes.io/projected/0021f619-a97e-4a5d-abda-e2dd3c7a3b80-kube-api-access-5l4zf\") pod \"network-check-source-8894fc9bd-l6sfq\" (UID: \"0021f619-a97e-4a5d-abda-e2dd3c7a3b80\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq"
Apr 21 01:51:34.519521 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:34.519484 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq"
Apr 21 01:51:36.393103 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:36.393064 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz"
Apr 21 01:51:36.393639 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:36.393139 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"
Apr 21 01:51:36.393639 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:36.393233 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 01:51:36.393639 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:36.393257 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 01:51:36.393639 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:36.393320 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls podName:cd5f55d0-df55-4f02-98ff-e867d88cad49 nodeName:}" failed. No retries permitted until 2026-04-21 01:51:40.393287529 +0000 UTC m=+80.163249688 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-76shz" (UID: "cd5f55d0-df55-4f02-98ff-e867d88cad49") : secret "samples-operator-tls" not found
Apr 21 01:51:36.393639 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:36.393341 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls podName:00c23ae5-f0a8-414b-9e12-1dfa9725e21a nodeName:}" failed. No retries permitted until 2026-04-21 01:51:40.393332072 +0000 UTC m=+80.163294225 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tbrdm" (UID: "00c23ae5-f0a8-414b-9e12-1dfa9725e21a") : secret "cluster-monitoring-operator-tls" not found
Apr 21 01:51:36.965480 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:36.965280 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq"]
Apr 21 01:51:36.970784 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:51:36.970741 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0021f619_a97e_4a5d_abda_e2dd3c7a3b80.slice/crio-512c341e19a3bbba5319bfb046a3fd78a8cf45e2ea1a86096c30b9c41995bc57 WatchSource:0}: Error finding container 512c341e19a3bbba5319bfb046a3fd78a8cf45e2ea1a86096c30b9c41995bc57: Status 404 returned error can't find the container with id 512c341e19a3bbba5319bfb046a3fd78a8cf45e2ea1a86096c30b9c41995bc57
Apr 21 01:51:37.080281 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.080214 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" event={"ID":"890e70ae-a9fe-418c-8f2f-c3e3ba235674","Type":"ContainerStarted","Data":"06af163a64127e84c85b89ddd310c663ba608b7b28f1dfbb2cdb1e44b4729bfe"}
Apr 21 01:51:37.081781 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.081754 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-cqf6k" event={"ID":"35c0ff07-3052-4608-8a7c-4b86babf4ea2","Type":"ContainerStarted","Data":"996dfab57e96cb6d279aa581157f815a5980cf2aadda8610e6e1c9d2a9610f95"}
Apr 21 01:51:37.083203 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.083157 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv" event={"ID":"b78c9a0f-a030-444c-886f-a49679306c25","Type":"ContainerStarted","Data":"bc3dab4ed62367cfb6bb0958d853fde574d71efd3bdb300b101466777e389ff1"}
Apr 21 01:51:37.084969 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.084933 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/0.log"
Apr 21 01:51:37.085064 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.084970 2568 generic.go:358] "Generic (PLEG): container finished" podID="e91998d5-ea6d-4d46-8984-013ce4758689" containerID="c1cfa341fb0d49959105519d77889168361c6405559501d1aa42210fe26de47e" exitCode=255
Apr 21 01:51:37.085064 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.085038 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" event={"ID":"e91998d5-ea6d-4d46-8984-013ce4758689","Type":"ContainerDied","Data":"c1cfa341fb0d49959105519d77889168361c6405559501d1aa42210fe26de47e"}
Apr 21 01:51:37.085261 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.085242 2568 scope.go:117] "RemoveContainer" containerID="c1cfa341fb0d49959105519d77889168361c6405559501d1aa42210fe26de47e"
Apr 21 01:51:37.088149 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.087701 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" event={"ID":"fca9b761-dba2-40df-99e6-41e04c0a7ffb","Type":"ContainerStarted","Data":"9ed4de03891cd87486173fb0973d8bfda0054db30784b3ebf2eb7bbabd15cd2b"}
Apr 21 01:51:37.089419 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.089397 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq" event={"ID":"0021f619-a97e-4a5d-abda-e2dd3c7a3b80","Type":"ContainerStarted","Data":"e2ccc6b7796f86b4a2082029badec0edc6a39d2c6ad7de24d4779af12fe5b726"}
Apr 21 01:51:37.089515 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.089424 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq" event={"ID":"0021f619-a97e-4a5d-abda-e2dd3c7a3b80","Type":"ContainerStarted","Data":"512c341e19a3bbba5319bfb046a3fd78a8cf45e2ea1a86096c30b9c41995bc57"}
Apr 21 01:51:37.094543 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.093818 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" podStartSLOduration=1.5460813249999998 podStartE2EDuration="5.09380512s" podCreationTimestamp="2026-04-21 01:51:32 +0000 UTC" firstStartedPulling="2026-04-21 01:51:33.292925807 +0000 UTC m=+73.062887960" lastFinishedPulling="2026-04-21 01:51:36.840649591 +0000 UTC m=+76.610611755" observedRunningTime="2026-04-21 01:51:37.093661676 +0000 UTC m=+76.863623852" watchObservedRunningTime="2026-04-21 01:51:37.09380512 +0000 UTC m=+76.863767296"
Apr 21 01:51:37.114963 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.114837 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f65qv" podStartSLOduration=1.4688766979999999 podStartE2EDuration="5.114821173s" podCreationTimestamp="2026-04-21 01:51:32 +0000 UTC" firstStartedPulling="2026-04-21 01:51:33.192841679 +0000 UTC m=+72.962803832" lastFinishedPulling="2026-04-21 01:51:36.838786141 +0000 UTC m=+76.608748307" observedRunningTime="2026-04-21 01:51:37.114176924 +0000 UTC m=+76.884139100" watchObservedRunningTime="2026-04-21 01:51:37.114821173 +0000 UTC m=+76.884783348"
Apr 21 01:51:37.129281 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.129160 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-l6sfq" podStartSLOduration=3.129144886 podStartE2EDuration="3.129144886s" podCreationTimestamp="2026-04-21 01:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:51:37.128655623 +0000 UTC m=+76.898617802" watchObservedRunningTime="2026-04-21 01:51:37.129144886 +0000 UTC m=+76.899107063"
Apr 21 01:51:37.145556 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.145453 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" podStartSLOduration=1.4938954930000001 podStartE2EDuration="5.145436865s" podCreationTimestamp="2026-04-21 01:51:32 +0000 UTC" firstStartedPulling="2026-04-21 01:51:33.191905429 +0000 UTC m=+72.961867597" lastFinishedPulling="2026-04-21 01:51:36.843446812 +0000 UTC m=+76.613408969" observedRunningTime="2026-04-21 01:51:37.144284901 +0000 UTC m=+76.914247084" watchObservedRunningTime="2026-04-21 01:51:37.145436865 +0000 UTC m=+76.915399041"
Apr 21 01:51:37.177607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:37.177424 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-cqf6k" podStartSLOduration=1.5237131750000001 podStartE2EDuration="5.177407674s" podCreationTimestamp="2026-04-21 01:51:32 +0000 UTC" firstStartedPulling="2026-04-21 01:51:33.192144389 +0000 UTC m=+72.962106551" lastFinishedPulling="2026-04-21 01:51:36.845838888 +0000 UTC m=+76.615801050" observedRunningTime="2026-04-21 01:51:37.176174382 +0000 UTC m=+76.946136558" watchObservedRunningTime="2026-04-21 01:51:37.177407674 +0000 UTC m=+76.947369853"
Apr 21 01:51:38.094758 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.094725 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/1.log"
Apr 21 01:51:38.095214 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.095194 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/0.log"
Apr 21 01:51:38.095258 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.095240 2568 generic.go:358] "Generic (PLEG): container finished" podID="e91998d5-ea6d-4d46-8984-013ce4758689" containerID="92b5530e79095bf3cf599da4b6c095b76b4f96df2a8c8d668848c6380827359f" exitCode=255
Apr 21 01:51:38.095406 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.095378 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" event={"ID":"e91998d5-ea6d-4d46-8984-013ce4758689","Type":"ContainerDied","Data":"92b5530e79095bf3cf599da4b6c095b76b4f96df2a8c8d668848c6380827359f"}
Apr 21 01:51:38.095550 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.095530 2568 scope.go:117] "RemoveContainer" containerID="c1cfa341fb0d49959105519d77889168361c6405559501d1aa42210fe26de47e"
Apr 21 01:51:38.095647 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.095626 2568 scope.go:117] "RemoveContainer" containerID="92b5530e79095bf3cf599da4b6c095b76b4f96df2a8c8d668848c6380827359f"
Apr 21 01:51:38.095854 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:38.095833 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fx5mm_openshift-console-operator(e91998d5-ea6d-4d46-8984-013ce4758689)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" podUID="e91998d5-ea6d-4d46-8984-013ce4758689"
Apr 21 01:51:38.397607 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.397528 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx"]
Apr 21 01:51:38.402036 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.402011 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx"
Apr 21 01:51:38.404445 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.404422 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 21 01:51:38.404524 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.404445 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 21 01:51:38.404597 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.404580 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-jkfcp\""
Apr 21 01:51:38.409478 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.409458 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx"]
Apr 21 01:51:38.410849 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.410760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gxjr\" (UniqueName: \"kubernetes.io/projected/c50383dd-d7ca-429b-a12e-4dea1a2761b8-kube-api-access-8gxjr\") pod \"migrator-74bb7799d9-sp8qx\" (UID: \"c50383dd-d7ca-429b-a12e-4dea1a2761b8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx"
Apr 21 01:51:38.511421 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.511373 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gxjr\" (UniqueName: \"kubernetes.io/projected/c50383dd-d7ca-429b-a12e-4dea1a2761b8-kube-api-access-8gxjr\") pod \"migrator-74bb7799d9-sp8qx\" (UID: \"c50383dd-d7ca-429b-a12e-4dea1a2761b8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx"
Apr 21 01:51:38.519636 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.519606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gxjr\" (UniqueName: \"kubernetes.io/projected/c50383dd-d7ca-429b-a12e-4dea1a2761b8-kube-api-access-8gxjr\") pod \"migrator-74bb7799d9-sp8qx\" (UID: \"c50383dd-d7ca-429b-a12e-4dea1a2761b8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx"
Apr 21 01:51:38.713820 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.713732 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx"
Apr 21 01:51:38.843371 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:38.843330 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx"]
Apr 21 01:51:38.848226 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:51:38.848193 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc50383dd_d7ca_429b_a12e_4dea1a2761b8.slice/crio-65f1e1f8afc0d45c84f64e668344c1cc38f024d26b61f4e568f1870b6fd39561 WatchSource:0}: Error finding container 65f1e1f8afc0d45c84f64e668344c1cc38f024d26b61f4e568f1870b6fd39561: Status 404 returned error can't find the container with id 65f1e1f8afc0d45c84f64e668344c1cc38f024d26b61f4e568f1870b6fd39561
Apr 21 01:51:39.098537 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:39.098511 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/1.log"
Apr 21 01:51:39.098935 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:39.098879 2568 scope.go:117] "RemoveContainer" containerID="92b5530e79095bf3cf599da4b6c095b76b4f96df2a8c8d668848c6380827359f"
Apr 21 01:51:39.099135 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:39.099111 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fx5mm_openshift-console-operator(e91998d5-ea6d-4d46-8984-013ce4758689)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" podUID="e91998d5-ea6d-4d46-8984-013ce4758689"
Apr 21 01:51:39.099632 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:39.099609 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx" event={"ID":"c50383dd-d7ca-429b-a12e-4dea1a2761b8","Type":"ContainerStarted","Data":"65f1e1f8afc0d45c84f64e668344c1cc38f024d26b61f4e568f1870b6fd39561"}
Apr 21 01:51:40.103768 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:40.103725 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx" event={"ID":"c50383dd-d7ca-429b-a12e-4dea1a2761b8","Type":"ContainerStarted","Data":"5bd76d799224f8d6c3fdc73dc925244b39f46325593dea55309b52292e6a205c"}
Apr 21 01:51:40.103768 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:40.103761 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx" event={"ID":"c50383dd-d7ca-429b-a12e-4dea1a2761b8","Type":"ContainerStarted","Data":"1e1aa18f5032c7c4293139ef2cd1a6efe425ae59cf2ed090161564945d7b4b95"}
Apr 21 01:51:40.117695 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:40.117654 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sp8qx" podStartSLOduration=1.068495175 podStartE2EDuration="2.1176409s" podCreationTimestamp="2026-04-21 01:51:38 +0000 UTC" firstStartedPulling="2026-04-21 01:51:38.850296998 +0000 UTC m=+78.620259154" lastFinishedPulling="2026-04-21 01:51:39.899442722 +0000 UTC m=+79.669404879" observedRunningTime="2026-04-21 01:51:40.116784202 +0000 UTC m=+79.886746373" watchObservedRunningTime="2026-04-21 01:51:40.1176409 +0000 UTC m=+79.887603074"
Apr 21 01:51:40.428512 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:40.428474 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz"
Apr 21 01:51:40.428686 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:40.428529 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"
Apr 21 01:51:40.428686 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:40.428612 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 01:51:40.428686 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:40.428640 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 01:51:40.428686 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:40.428675 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls podName:cd5f55d0-df55-4f02-98ff-e867d88cad49 nodeName:}" failed. No retries permitted until 2026-04-21 01:51:48.428660653 +0000 UTC m=+88.198622807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-76shz" (UID: "cd5f55d0-df55-4f02-98ff-e867d88cad49") : secret "samples-operator-tls" not found
Apr 21 01:51:40.428820 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:40.428690 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls podName:00c23ae5-f0a8-414b-9e12-1dfa9725e21a nodeName:}" failed. No retries permitted until 2026-04-21 01:51:48.428683601 +0000 UTC m=+88.198645754 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tbrdm" (UID: "00c23ae5-f0a8-414b-9e12-1dfa9725e21a") : secret "cluster-monitoring-operator-tls" not found
Apr 21 01:51:41.574738 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:41.574712 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-shp9g_d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8/dns-node-resolver/0.log"
Apr 21 01:51:42.375771 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:42.375741 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bv5lq_94b3f448-6380-4226-b329-a7e8b2cad657/node-ca/0.log"
Apr 21 01:51:42.799046 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:42.798970 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm"
Apr 21 01:51:42.799046 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:42.799006 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm"
Apr 21 01:51:42.799432
ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:42.799344 2568 scope.go:117] "RemoveContainer" containerID="92b5530e79095bf3cf599da4b6c095b76b4f96df2a8c8d668848c6380827359f" Apr 21 01:51:42.799521 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:42.799503 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fx5mm_openshift-console-operator(e91998d5-ea6d-4d46-8984-013ce4758689)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" podUID="e91998d5-ea6d-4d46-8984-013ce4758689" Apr 21 01:51:43.775011 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:43.774978 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sp8qx_c50383dd-d7ca-429b-a12e-4dea1a2761b8/migrator/0.log" Apr 21 01:51:43.974670 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:43.974640 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sp8qx_c50383dd-d7ca-429b-a12e-4dea1a2761b8/graceful-termination/0.log" Apr 21 01:51:44.175427 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:44.175386 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-75sps_fca9b761-dba2-40df-99e6-41e04c0a7ffb/kube-storage-version-migrator-operator/0.log" Apr 21 01:51:48.492621 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:48.492572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" Apr 21 01:51:48.493026 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:48.492643 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" Apr 21 01:51:48.493026 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:48.492778 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 01:51:48.493026 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:51:48.492841 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls podName:00c23ae5-f0a8-414b-9e12-1dfa9725e21a nodeName:}" failed. No retries permitted until 2026-04-21 01:52:04.492827488 +0000 UTC m=+104.262789642 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tbrdm" (UID: "00c23ae5-f0a8-414b-9e12-1dfa9725e21a") : secret "cluster-monitoring-operator-tls" not found Apr 21 01:51:48.494927 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:48.494908 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5f55d0-df55-4f02-98ff-e867d88cad49-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-76shz\" (UID: \"cd5f55d0-df55-4f02-98ff-e867d88cad49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" Apr 21 01:51:48.787085 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:48.787016 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" Apr 21 01:51:48.901232 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:48.901201 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz"] Apr 21 01:51:49.126867 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:49.126831 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" event={"ID":"cd5f55d0-df55-4f02-98ff-e867d88cad49","Type":"ContainerStarted","Data":"f352f3eb82c002d8574dfd921025ee5a1cc7837b7f1ad294a43cb4b93110b64e"} Apr 21 01:51:51.133663 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:51.133625 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" event={"ID":"cd5f55d0-df55-4f02-98ff-e867d88cad49","Type":"ContainerStarted","Data":"59252d8cad8fe07b480fa688373ed14eddd334bccc9fd20b7f894c74630e9ccf"} Apr 
21 01:51:51.134027 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:51.133667 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" event={"ID":"cd5f55d0-df55-4f02-98ff-e867d88cad49","Type":"ContainerStarted","Data":"41454ff70d3124a5ff91a4d539cbc98261537034cb0e128639a5e0279be83957"} Apr 21 01:51:51.148695 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:51.148637 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-76shz" podStartSLOduration=17.472506182 podStartE2EDuration="19.148622155s" podCreationTimestamp="2026-04-21 01:51:32 +0000 UTC" firstStartedPulling="2026-04-21 01:51:48.939094962 +0000 UTC m=+88.709057118" lastFinishedPulling="2026-04-21 01:51:50.615210925 +0000 UTC m=+90.385173091" observedRunningTime="2026-04-21 01:51:51.147758287 +0000 UTC m=+90.917720532" watchObservedRunningTime="2026-04-21 01:51:51.148622155 +0000 UTC m=+90.918584328" Apr 21 01:51:57.564823 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.564773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k" Apr 21 01:51:57.564823 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.564831 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj" Apr 21 01:51:57.567303 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.567277 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/82bc4cd8-ea54-4a2a-ae0f-172581c8dace-metrics-tls\") pod \"dns-default-fjfcj\" (UID: \"82bc4cd8-ea54-4a2a-ae0f-172581c8dace\") " pod="openshift-dns/dns-default-fjfcj" Apr 21 01:51:57.567877 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.567854 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d936ad0-666c-49cb-8f95-d04607cc5b52-cert\") pod \"ingress-canary-9zg7k\" (UID: \"7d936ad0-666c-49cb-8f95-d04607cc5b52\") " pod="openshift-ingress-canary/ingress-canary-9zg7k" Apr 21 01:51:57.772838 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.772810 2568 scope.go:117] "RemoveContainer" containerID="92b5530e79095bf3cf599da4b6c095b76b4f96df2a8c8d668848c6380827359f" Apr 21 01:51:57.805289 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.805268 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vd9s5\"" Apr 21 01:51:57.810873 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.810854 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbqdn\"" Apr 21 01:51:57.813573 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.813556 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fjfcj" Apr 21 01:51:57.820155 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.820106 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9zg7k" Apr 21 01:51:57.944159 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.944133 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fjfcj"] Apr 21 01:51:57.946756 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:51:57.946722 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82bc4cd8_ea54_4a2a_ae0f_172581c8dace.slice/crio-bfe829e528fae27c9c63fea8bf2888aba8a1e35737fbb8859b7172fc8a2d1e98 WatchSource:0}: Error finding container bfe829e528fae27c9c63fea8bf2888aba8a1e35737fbb8859b7172fc8a2d1e98: Status 404 returned error can't find the container with id bfe829e528fae27c9c63fea8bf2888aba8a1e35737fbb8859b7172fc8a2d1e98 Apr 21 01:51:57.964224 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:57.964199 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9zg7k"] Apr 21 01:51:57.967116 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:51:57.967090 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d936ad0_666c_49cb_8f95_d04607cc5b52.slice/crio-67a4df04e69527b9227481a9cca47057b67c1f38ba783d9fc20626599412864f WatchSource:0}: Error finding container 67a4df04e69527b9227481a9cca47057b67c1f38ba783d9fc20626599412864f: Status 404 returned error can't find the container with id 67a4df04e69527b9227481a9cca47057b67c1f38ba783d9fc20626599412864f Apr 21 01:51:58.151878 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:58.151848 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fjfcj" event={"ID":"82bc4cd8-ea54-4a2a-ae0f-172581c8dace","Type":"ContainerStarted","Data":"bfe829e528fae27c9c63fea8bf2888aba8a1e35737fbb8859b7172fc8a2d1e98"} Apr 21 01:51:58.152902 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:58.152877 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9zg7k" event={"ID":"7d936ad0-666c-49cb-8f95-d04607cc5b52","Type":"ContainerStarted","Data":"67a4df04e69527b9227481a9cca47057b67c1f38ba783d9fc20626599412864f"} Apr 21 01:51:58.156829 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:58.156809 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/1.log" Apr 21 01:51:58.156920 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:58.156856 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" event={"ID":"e91998d5-ea6d-4d46-8984-013ce4758689","Type":"ContainerStarted","Data":"2580845a8990b6764715c24b4f8b61de067c18bf68d825922f5e650a0a749ce8"} Apr 21 01:51:58.157144 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:58.157129 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:58.824526 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:58.824429 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" Apr 21 01:51:58.843795 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:51:58.843749 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-fx5mm" podStartSLOduration=22.972018754 podStartE2EDuration="26.843733932s" podCreationTimestamp="2026-04-21 01:51:32 +0000 UTC" firstStartedPulling="2026-04-21 01:51:32.96895961 +0000 UTC m=+72.738921762" lastFinishedPulling="2026-04-21 01:51:36.840674787 +0000 UTC m=+76.610636940" observedRunningTime="2026-04-21 01:51:58.170803132 +0000 UTC m=+97.940765308" watchObservedRunningTime="2026-04-21 01:51:58.843733932 +0000 UTC m=+98.613696100" Apr 21 01:52:00.164002 ip-10-0-141-35 
kubenswrapper[2568]: I0421 01:52:00.163963 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9zg7k" event={"ID":"7d936ad0-666c-49cb-8f95-d04607cc5b52","Type":"ContainerStarted","Data":"b9f2b79fa4ca1beab5499ea27fa8745d39b199a012cf54bdd24db438e5ea6e08"} Apr 21 01:52:00.165565 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.165538 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fjfcj" event={"ID":"82bc4cd8-ea54-4a2a-ae0f-172581c8dace","Type":"ContainerStarted","Data":"bcd749999aae9e502c69cf772faf98cb29e6adc9c3694b2556670260b206f70d"} Apr 21 01:52:00.165565 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.165569 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fjfcj" event={"ID":"82bc4cd8-ea54-4a2a-ae0f-172581c8dace","Type":"ContainerStarted","Data":"1474e593433132fa63fb2957646cc9f944ec8970da46d9ca10e6e5e7fd450472"} Apr 21 01:52:00.165760 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.165749 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fjfcj" Apr 21 01:52:00.178303 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.178268 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9zg7k" podStartSLOduration=65.327221607 podStartE2EDuration="1m7.178256363s" podCreationTimestamp="2026-04-21 01:50:53 +0000 UTC" firstStartedPulling="2026-04-21 01:51:57.969123118 +0000 UTC m=+97.739085271" lastFinishedPulling="2026-04-21 01:51:59.820157868 +0000 UTC m=+99.590120027" observedRunningTime="2026-04-21 01:52:00.177914603 +0000 UTC m=+99.947876779" watchObservedRunningTime="2026-04-21 01:52:00.178256363 +0000 UTC m=+99.948218566" Apr 21 01:52:00.192052 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.192015 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fjfcj" 
podStartSLOduration=65.323854067 podStartE2EDuration="1m7.192005062s" podCreationTimestamp="2026-04-21 01:50:53 +0000 UTC" firstStartedPulling="2026-04-21 01:51:57.949151715 +0000 UTC m=+97.719113870" lastFinishedPulling="2026-04-21 01:51:59.817302708 +0000 UTC m=+99.587264865" observedRunningTime="2026-04-21 01:52:00.191567926 +0000 UTC m=+99.961530102" watchObservedRunningTime="2026-04-21 01:52:00.192005062 +0000 UTC m=+99.961967236" Apr 21 01:52:00.586786 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.586755 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4"] Apr 21 01:52:00.589718 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.589700 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4" Apr 21 01:52:00.591941 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.591920 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 01:52:00.592062 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.591942 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-sxn8p\"" Apr 21 01:52:00.592110 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.592098 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 01:52:00.599254 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.599197 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pbql6"] Apr 21 01:52:00.602142 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.602127 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4"] Apr 21 01:52:00.602227 ip-10-0-141-35 
kubenswrapper[2568]: I0421 01:52:00.602217 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pbql6" Apr 21 01:52:00.604197 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.604180 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 01:52:00.604292 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.604210 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 01:52:00.604292 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.604221 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-q47kz\"" Apr 21 01:52:00.611337 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.611317 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pbql6"] Apr 21 01:52:00.686114 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.686085 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-hjxmg"] Apr 21 01:52:00.689252 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.689228 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hjxmg" Apr 21 01:52:00.691742 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.691723 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-ffrgr\"" Apr 21 01:52:00.692175 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.692158 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 01:52:00.692316 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.692209 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 01:52:00.692398 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.692367 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5c53bee-bb5b-4e22-8b9a-eb988c725638-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cjcd4\" (UID: \"b5c53bee-bb5b-4e22-8b9a-eb988c725638\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4" Apr 21 01:52:00.692450 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.692400 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b5c53bee-bb5b-4e22-8b9a-eb988c725638-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cjcd4\" (UID: \"b5c53bee-bb5b-4e22-8b9a-eb988c725638\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4" Apr 21 01:52:00.692450 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.692426 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c4e2249-a98a-4f0c-b7c6-84207d6db519-crio-socket\") pod \"insights-runtime-extractor-pbql6\" (UID: 
\"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6" Apr 21 01:52:00.692524 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.692459 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c4e2249-a98a-4f0c-b7c6-84207d6db519-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6" Apr 21 01:52:00.692568 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.692552 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c4e2249-a98a-4f0c-b7c6-84207d6db519-data-volume\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6" Apr 21 01:52:00.692600 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.692587 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctrr7\" (UniqueName: \"kubernetes.io/projected/7c4e2249-a98a-4f0c-b7c6-84207d6db519-kube-api-access-ctrr7\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6" Apr 21 01:52:00.692640 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.692621 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c4e2249-a98a-4f0c-b7c6-84207d6db519-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6" Apr 21 01:52:00.701064 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.701043 
2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hjxmg"] Apr 21 01:52:00.707663 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.707646 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8654d455dc-jnl96"] Apr 21 01:52:00.710557 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.710542 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8654d455dc-jnl96" Apr 21 01:52:00.712694 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.712666 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 01:52:00.712694 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.712690 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 01:52:00.712830 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.712716 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 01:52:00.712830 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.712797 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hdqjq\"" Apr 21 01:52:00.718806 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.718785 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 01:52:00.720668 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.720645 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8654d455dc-jnl96"] Apr 21 01:52:00.793879 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.793847 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-registry-tls\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96" Apr 21 01:52:00.793879 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.793880 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-registry-certificates\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96" Apr 21 01:52:00.794122 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.793906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c4e2249-a98a-4f0c-b7c6-84207d6db519-data-volume\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6" Apr 21 01:52:00.794122 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.793962 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-image-registry-private-configuration\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96" Apr 21 01:52:00.794122 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794017 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctrr7\" (UniqueName: \"kubernetes.io/projected/7c4e2249-a98a-4f0c-b7c6-84207d6db519-kube-api-access-ctrr7\") pod \"insights-runtime-extractor-pbql6\" (UID: 
\"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6"
Apr 21 01:52:00.794122 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794047 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c4e2249-a98a-4f0c-b7c6-84207d6db519-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6"
Apr 21 01:52:00.794122 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794075 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-installation-pull-secrets\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.794372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794126 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5r65\" (UniqueName: \"kubernetes.io/projected/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-kube-api-access-s5r65\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.794372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794161 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c4e2249-a98a-4f0c-b7c6-84207d6db519-data-volume\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6"
Apr 21 01:52:00.794372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794177 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5c53bee-bb5b-4e22-8b9a-eb988c725638-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cjcd4\" (UID: \"b5c53bee-bb5b-4e22-8b9a-eb988c725638\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4"
Apr 21 01:52:00.794372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794218 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b5c53bee-bb5b-4e22-8b9a-eb988c725638-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cjcd4\" (UID: \"b5c53bee-bb5b-4e22-8b9a-eb988c725638\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4"
Apr 21 01:52:00.794372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794239 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-trusted-ca\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.794372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794264 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c4e2249-a98a-4f0c-b7c6-84207d6db519-crio-socket\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6"
Apr 21 01:52:00.794372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794294 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-ca-trust-extracted\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.794372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794335 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c4e2249-a98a-4f0c-b7c6-84207d6db519-crio-socket\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6"
Apr 21 01:52:00.794372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794360 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c4e2249-a98a-4f0c-b7c6-84207d6db519-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6"
Apr 21 01:52:00.794716 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794381 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prtsr\" (UniqueName: \"kubernetes.io/projected/98d2bf0f-70f2-458d-8888-7120eb19ed23-kube-api-access-prtsr\") pod \"downloads-6bcc868b7-hjxmg\" (UID: \"98d2bf0f-70f2-458d-8888-7120eb19ed23\") " pod="openshift-console/downloads-6bcc868b7-hjxmg"
Apr 21 01:52:00.794716 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794400 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-bound-sa-token\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.794801 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794762 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c4e2249-a98a-4f0c-b7c6-84207d6db519-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6"
Apr 21 01:52:00.794956 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.794933 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b5c53bee-bb5b-4e22-8b9a-eb988c725638-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cjcd4\" (UID: \"b5c53bee-bb5b-4e22-8b9a-eb988c725638\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4"
Apr 21 01:52:00.796403 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.796376 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c4e2249-a98a-4f0c-b7c6-84207d6db519-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6"
Apr 21 01:52:00.796609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.796593 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5c53bee-bb5b-4e22-8b9a-eb988c725638-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cjcd4\" (UID: \"b5c53bee-bb5b-4e22-8b9a-eb988c725638\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4"
Apr 21 01:52:00.804475 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.804452 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctrr7\" (UniqueName: \"kubernetes.io/projected/7c4e2249-a98a-4f0c-b7c6-84207d6db519-kube-api-access-ctrr7\") pod \"insights-runtime-extractor-pbql6\" (UID: \"7c4e2249-a98a-4f0c-b7c6-84207d6db519\") " pod="openshift-insights/insights-runtime-extractor-pbql6"
Apr 21 01:52:00.894768 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.894689 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-trusted-ca\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.894768 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.894723 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-ca-trust-extracted\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.894768 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.894765 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prtsr\" (UniqueName: \"kubernetes.io/projected/98d2bf0f-70f2-458d-8888-7120eb19ed23-kube-api-access-prtsr\") pod \"downloads-6bcc868b7-hjxmg\" (UID: \"98d2bf0f-70f2-458d-8888-7120eb19ed23\") " pod="openshift-console/downloads-6bcc868b7-hjxmg"
Apr 21 01:52:00.895045 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.894783 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-bound-sa-token\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.895045 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.894802 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-registry-tls\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.895045 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.894827 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-registry-certificates\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.895045 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.894866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-image-registry-private-configuration\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.895045 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.894907 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-installation-pull-secrets\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.895765 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.895737 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-trusted-ca\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.895765 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.895752 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-registry-certificates\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.895930 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.895796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-ca-trust-extracted\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.895930 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.895892 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5r65\" (UniqueName: \"kubernetes.io/projected/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-kube-api-access-s5r65\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.897542 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.897518 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-registry-tls\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.897633 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.897562 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-image-registry-private-configuration\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.897748 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.897727 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-installation-pull-secrets\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.897987 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.897974 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4"
Apr 21 01:52:00.903496 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.903475 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-bound-sa-token\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.903728 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.903707 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prtsr\" (UniqueName: \"kubernetes.io/projected/98d2bf0f-70f2-458d-8888-7120eb19ed23-kube-api-access-prtsr\") pod \"downloads-6bcc868b7-hjxmg\" (UID: \"98d2bf0f-70f2-458d-8888-7120eb19ed23\") " pod="openshift-console/downloads-6bcc868b7-hjxmg"
Apr 21 01:52:00.904136 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.904118 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5r65\" (UniqueName: \"kubernetes.io/projected/5f58d8d6-b76a-47fa-b63b-84a55e51e3e8-kube-api-access-s5r65\") pod \"image-registry-8654d455dc-jnl96\" (UID: \"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8\") " pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:00.911013 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.910992 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pbql6"
Apr 21 01:52:00.998325 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:00.998287 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hjxmg"
Apr 21 01:52:01.019975 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:01.019941 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:01.034175 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:01.034148 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4"]
Apr 21 01:52:01.036932 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:52:01.036901 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5c53bee_bb5b_4e22_8b9a_eb988c725638.slice/crio-474352c23ae800122e43f9e21fb4bf7217ce92b35911b04a17228ad319262b27 WatchSource:0}: Error finding container 474352c23ae800122e43f9e21fb4bf7217ce92b35911b04a17228ad319262b27: Status 404 returned error can't find the container with id 474352c23ae800122e43f9e21fb4bf7217ce92b35911b04a17228ad319262b27
Apr 21 01:52:01.053516 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:01.052475 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pbql6"]
Apr 21 01:52:01.059019 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:52:01.058980 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c4e2249_a98a_4f0c_b7c6_84207d6db519.slice/crio-a40c80e5531c7bee76bda89cc10d8edc11323cc286fb81d8f2b51bcfd9968859 WatchSource:0}: Error finding container a40c80e5531c7bee76bda89cc10d8edc11323cc286fb81d8f2b51bcfd9968859: Status 404 returned error can't find the container with id a40c80e5531c7bee76bda89cc10d8edc11323cc286fb81d8f2b51bcfd9968859
Apr 21 01:52:01.132160 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:01.132091 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hjxmg"]
Apr 21 01:52:01.135178 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:52:01.135148 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98d2bf0f_70f2_458d_8888_7120eb19ed23.slice/crio-ff00d4f90e41b7aebef3e568e135f9afb6e5c94f51cf816b0881ac18e79fad11 WatchSource:0}: Error finding container ff00d4f90e41b7aebef3e568e135f9afb6e5c94f51cf816b0881ac18e79fad11: Status 404 returned error can't find the container with id ff00d4f90e41b7aebef3e568e135f9afb6e5c94f51cf816b0881ac18e79fad11
Apr 21 01:52:01.157543 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:01.157499 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8654d455dc-jnl96"]
Apr 21 01:52:01.185551 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:01.185527 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8654d455dc-jnl96" event={"ID":"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8","Type":"ContainerStarted","Data":"d99f65e43692edfef61569466b3c3e5d8b563c78ac4f4703cff8b55ccf8a3c08"}
Apr 21 01:52:01.186579 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:01.186493 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-hjxmg" event={"ID":"98d2bf0f-70f2-458d-8888-7120eb19ed23","Type":"ContainerStarted","Data":"ff00d4f90e41b7aebef3e568e135f9afb6e5c94f51cf816b0881ac18e79fad11"}
Apr 21 01:52:01.187743 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:01.187724 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pbql6" event={"ID":"7c4e2249-a98a-4f0c-b7c6-84207d6db519","Type":"ContainerStarted","Data":"6a5cf80d72208b39e419dacd27eaeed90f83a72d90a2293ddc1ea5e16919c939"}
Apr 21 01:52:01.187820 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:01.187751 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pbql6" event={"ID":"7c4e2249-a98a-4f0c-b7c6-84207d6db519","Type":"ContainerStarted","Data":"a40c80e5531c7bee76bda89cc10d8edc11323cc286fb81d8f2b51bcfd9968859"}
Apr 21 01:52:01.188736 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:01.188713 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4" event={"ID":"b5c53bee-bb5b-4e22-8b9a-eb988c725638","Type":"ContainerStarted","Data":"474352c23ae800122e43f9e21fb4bf7217ce92b35911b04a17228ad319262b27"}
Apr 21 01:52:02.193482 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:02.193427 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8654d455dc-jnl96" event={"ID":"5f58d8d6-b76a-47fa-b63b-84a55e51e3e8","Type":"ContainerStarted","Data":"3b0ef55275a2b16ae65be544e6f7e765f6719c87168cbd151597bcb7a7d3ddf2"}
Apr 21 01:52:02.193877 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:02.193539 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:02.195173 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:02.195142 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pbql6" event={"ID":"7c4e2249-a98a-4f0c-b7c6-84207d6db519","Type":"ContainerStarted","Data":"b2fcbcdb6574f0c06f6233ea7ed7d8b9d4c2c9a7d412fb5a61d5a67f37e67d1f"}
Apr 21 01:52:02.196689 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:02.196661 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4" event={"ID":"b5c53bee-bb5b-4e22-8b9a-eb988c725638","Type":"ContainerStarted","Data":"9e9b75fedaef1e4d7e0fdac3ab26771e51d2e9e4ec515547d788abadc5b0e251"}
Apr 21 01:52:02.222070 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:02.221971 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8654d455dc-jnl96" podStartSLOduration=2.22195554 podStartE2EDuration="2.22195554s" podCreationTimestamp="2026-04-21 01:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:52:02.221683158 +0000 UTC m=+101.991645334" watchObservedRunningTime="2026-04-21 01:52:02.22195554 +0000 UTC m=+101.991917693"
Apr 21 01:52:02.244834 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:02.244716 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cjcd4" podStartSLOduration=1.27513507 podStartE2EDuration="2.244701338s" podCreationTimestamp="2026-04-21 01:52:00 +0000 UTC" firstStartedPulling="2026-04-21 01:52:01.039114633 +0000 UTC m=+100.809076807" lastFinishedPulling="2026-04-21 01:52:02.008680911 +0000 UTC m=+101.778643075" observedRunningTime="2026-04-21 01:52:02.243484527 +0000 UTC m=+102.013446702" watchObservedRunningTime="2026-04-21 01:52:02.244701338 +0000 UTC m=+102.014663514"
Apr 21 01:52:04.204083 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:04.204042 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pbql6" event={"ID":"7c4e2249-a98a-4f0c-b7c6-84207d6db519","Type":"ContainerStarted","Data":"d326e5132dff9343c0bbc52ee01338ded656e4e967cfb74c6b5d54a918009d7f"}
Apr 21 01:52:04.221156 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:04.221106 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pbql6" podStartSLOduration=1.839285996 podStartE2EDuration="4.221091164s" podCreationTimestamp="2026-04-21 01:52:00 +0000 UTC" firstStartedPulling="2026-04-21 01:52:01.127443486 +0000 UTC m=+100.897405651" lastFinishedPulling="2026-04-21 01:52:03.509248651 +0000 UTC m=+103.279210819" observedRunningTime="2026-04-21 01:52:04.219360952 +0000 UTC m=+103.989323124" watchObservedRunningTime="2026-04-21 01:52:04.221091164 +0000 UTC m=+103.991053338"
Apr 21 01:52:04.529289 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:04.529191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"
Apr 21 01:52:04.532042 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:04.532011 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/00c23ae5-f0a8-414b-9e12-1dfa9725e21a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tbrdm\" (UID: \"00c23ae5-f0a8-414b-9e12-1dfa9725e21a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"
Apr 21 01:52:04.691779 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:04.691739 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"
Apr 21 01:52:04.821589 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:04.821510 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm"]
Apr 21 01:52:04.827227 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:52:04.827196 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00c23ae5_f0a8_414b_9e12_1dfa9725e21a.slice/crio-aa8a1ce5eb5f7d9eeb6b9cef23b8c44b4e5bfa61bc8e0b95d9890368bdf2bffa WatchSource:0}: Error finding container aa8a1ce5eb5f7d9eeb6b9cef23b8c44b4e5bfa61bc8e0b95d9890368bdf2bffa: Status 404 returned error can't find the container with id aa8a1ce5eb5f7d9eeb6b9cef23b8c44b4e5bfa61bc8e0b95d9890368bdf2bffa
Apr 21 01:52:05.207896 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:05.207846 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" event={"ID":"00c23ae5-f0a8-414b-9e12-1dfa9725e21a","Type":"ContainerStarted","Data":"aa8a1ce5eb5f7d9eeb6b9cef23b8c44b4e5bfa61bc8e0b95d9890368bdf2bffa"}
Apr 21 01:52:07.215013 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:07.214972 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" event={"ID":"00c23ae5-f0a8-414b-9e12-1dfa9725e21a","Type":"ContainerStarted","Data":"22823ef4a71243555b5cdfa25a4a3c245137dfbe31e8f3967449112f52a5c3a8"}
Apr 21 01:52:07.229368 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:07.229297 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tbrdm" podStartSLOduration=33.224206505 podStartE2EDuration="35.229280771s" podCreationTimestamp="2026-04-21 01:51:32 +0000 UTC" firstStartedPulling="2026-04-21 01:52:04.829618685 +0000 UTC m=+104.599580838" lastFinishedPulling="2026-04-21 01:52:06.834692933 +0000 UTC m=+106.604655104" observedRunningTime="2026-04-21 01:52:07.228435075 +0000 UTC m=+106.998397250" watchObservedRunningTime="2026-04-21 01:52:07.229280771 +0000 UTC m=+106.999242947"
Apr 21 01:52:10.191041 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:10.190998 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fjfcj"
Apr 21 01:52:16.736289 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.736150 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ktwrd"]
Apr 21 01:52:16.739714 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.739692 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.741999 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.741912 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 01:52:16.742145 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.742102 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 01:52:16.742215 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.742156 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2rf2h\""
Apr 21 01:52:16.742277 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.742156 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 01:52:16.743010 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.742991 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 01:52:16.837210 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.837171 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-metrics-client-ca\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.837415 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.837218 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-wtmp\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.837415 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.837325 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-accelerators-collector-config\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.837415 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.837370 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbczr\" (UniqueName: \"kubernetes.io/projected/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-kube-api-access-sbczr\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.837575 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.837421 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-tls\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.837575 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.837444 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-root\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.837575 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.837483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.837723 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.837578 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-textfile\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.837723 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.837624 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-sys\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938359 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938334 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-metrics-client-ca\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938470 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938367 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-wtmp\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938470 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938399 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-accelerators-collector-config\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938470 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbczr\" (UniqueName: \"kubernetes.io/projected/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-kube-api-access-sbczr\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938470 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938468 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-tls\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938664 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938494 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-root\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938664 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938526 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938664 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938524 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-wtmp\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938664 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938594 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-root\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938875 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-textfile\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.938875 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938827 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-sys\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.939054 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.938934 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-sys\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.939054 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.939003 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-metrics-client-ca\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.939151 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.939088 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-textfile\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.939151 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.939100 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-accelerators-collector-config\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.940804 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.940776 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.940947 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.940930 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-node-exporter-tls\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:16.946846 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:16.946823 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbczr\" (UniqueName: \"kubernetes.io/projected/a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1-kube-api-access-sbczr\") pod \"node-exporter-ktwrd\" (UID: \"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1\") " pod="openshift-monitoring/node-exporter-ktwrd"
Apr 21 01:52:17.051153 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:17.050685 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ktwrd" Apr 21 01:52:17.061557 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:52:17.061515 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda31f1fd7_9a55_4c9b_8bdf_c09c4290b0c1.slice/crio-1495572e741d443503112ff4bbad87a547e5fb2a1931427fcabd001bca345e21 WatchSource:0}: Error finding container 1495572e741d443503112ff4bbad87a547e5fb2a1931427fcabd001bca345e21: Status 404 returned error can't find the container with id 1495572e741d443503112ff4bbad87a547e5fb2a1931427fcabd001bca345e21 Apr 21 01:52:17.245452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:17.245411 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-hjxmg" event={"ID":"98d2bf0f-70f2-458d-8888-7120eb19ed23","Type":"ContainerStarted","Data":"e577d86c2c83b5aa155b46a6c67ed17038f8cce9b093e9d3d8c983d67437e99b"} Apr 21 01:52:17.245663 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:17.245642 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-hjxmg" Apr 21 01:52:17.246708 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:17.246674 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ktwrd" event={"ID":"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1","Type":"ContainerStarted","Data":"1495572e741d443503112ff4bbad87a547e5fb2a1931427fcabd001bca345e21"} Apr 21 01:52:17.260991 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:17.260908 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-hjxmg" Apr 21 01:52:17.262177 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:17.262122 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-hjxmg" podStartSLOduration=1.466161028 podStartE2EDuration="17.262105538s" 
podCreationTimestamp="2026-04-21 01:52:00 +0000 UTC" firstStartedPulling="2026-04-21 01:52:01.137252352 +0000 UTC m=+100.907214505" lastFinishedPulling="2026-04-21 01:52:16.933196859 +0000 UTC m=+116.703159015" observedRunningTime="2026-04-21 01:52:17.261376198 +0000 UTC m=+117.031338366" watchObservedRunningTime="2026-04-21 01:52:17.262105538 +0000 UTC m=+117.032067716" Apr 21 01:52:18.252031 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:18.251998 2568 generic.go:358] "Generic (PLEG): container finished" podID="a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1" containerID="c51f7ab37d7dd29d233fa808f3247508fc993b54c397f0794735b3ad6aa3ed3d" exitCode=0 Apr 21 01:52:18.252495 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:18.252079 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ktwrd" event={"ID":"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1","Type":"ContainerDied","Data":"c51f7ab37d7dd29d233fa808f3247508fc993b54c397f0794735b3ad6aa3ed3d"} Apr 21 01:52:19.257918 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:19.257880 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ktwrd" event={"ID":"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1","Type":"ContainerStarted","Data":"192eb11ed9f1a2be3795a5e3a39f739c862c292f655e51b5b4f729e967e3d4b4"} Apr 21 01:52:19.258407 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:19.257929 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ktwrd" event={"ID":"a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1","Type":"ContainerStarted","Data":"c0e9975e55bea16e0d160f26c73e11459d7e1c58b412d8aeb79aa5f7a34f0310"} Apr 21 01:52:19.278441 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:19.278383 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ktwrd" podStartSLOduration=2.596415994 podStartE2EDuration="3.278364228s" podCreationTimestamp="2026-04-21 01:52:16 +0000 UTC" 
firstStartedPulling="2026-04-21 01:52:17.063533237 +0000 UTC m=+116.833495390" lastFinishedPulling="2026-04-21 01:52:17.74548146 +0000 UTC m=+117.515443624" observedRunningTime="2026-04-21 01:52:19.277733311 +0000 UTC m=+119.047695489" watchObservedRunningTime="2026-04-21 01:52:19.278364228 +0000 UTC m=+119.048326418" Apr 21 01:52:21.511085 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:21.511050 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q"] Apr 21 01:52:21.535847 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:21.535812 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q"] Apr 21 01:52:21.536008 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:21.535950 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q" Apr 21 01:52:21.538078 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:21.538046 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-nsbz2\"" Apr 21 01:52:21.538238 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:21.538157 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 01:52:21.579727 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:21.579692 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4dd276dd-7cf5-4018-bea5-741cfedf9db9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-gwb4q\" (UID: \"4dd276dd-7cf5-4018-bea5-741cfedf9db9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q" Apr 21 01:52:21.680338 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:21.680283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4dd276dd-7cf5-4018-bea5-741cfedf9db9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-gwb4q\" (UID: \"4dd276dd-7cf5-4018-bea5-741cfedf9db9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q" Apr 21 01:52:21.680513 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:52:21.680419 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 01:52:21.680513 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:52:21.680490 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dd276dd-7cf5-4018-bea5-741cfedf9db9-monitoring-plugin-cert podName:4dd276dd-7cf5-4018-bea5-741cfedf9db9 nodeName:}" failed. No retries permitted until 2026-04-21 01:52:22.180470775 +0000 UTC m=+121.950432931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/4dd276dd-7cf5-4018-bea5-741cfedf9db9-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-gwb4q" (UID: "4dd276dd-7cf5-4018-bea5-741cfedf9db9") : secret "monitoring-plugin-cert" not found Apr 21 01:52:22.184750 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.184716 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4dd276dd-7cf5-4018-bea5-741cfedf9db9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-gwb4q\" (UID: \"4dd276dd-7cf5-4018-bea5-741cfedf9db9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q" Apr 21 01:52:22.187525 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.187496 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4dd276dd-7cf5-4018-bea5-741cfedf9db9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-gwb4q\" (UID: 
\"4dd276dd-7cf5-4018-bea5-741cfedf9db9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q" Apr 21 01:52:22.446889 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.446803 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q" Apr 21 01:52:22.587877 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.587846 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q"] Apr 21 01:52:22.591213 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:52:22.591185 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd276dd_7cf5_4018_bea5_741cfedf9db9.slice/crio-c96a3fb464181d1a194df2869b2f86db64f6c2e081483194bab27f14f24daf49 WatchSource:0}: Error finding container c96a3fb464181d1a194df2869b2f86db64f6c2e081483194bab27f14f24daf49: Status 404 returned error can't find the container with id c96a3fb464181d1a194df2869b2f86db64f6c2e081483194bab27f14f24daf49 Apr 21 01:52:22.965154 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.965116 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 01:52:22.983104 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.983072 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.986671 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.986643 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 01:52:22.988452 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.988430 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 01:52:22.988628 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.988611 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 01:52:22.988719 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.988690 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 01:52:22.988719 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.988700 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 01:52:22.988839 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.988699 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 01:52:22.988839 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.988818 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 01:52:22.988941 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.988817 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 01:52:22.988941 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.988903 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 01:52:22.989036 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.988972 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8qmtj\"" Apr 21 01:52:22.989036 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.988979 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 01:52:22.989131 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.989038 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3hso0acdbp7gp\"" Apr 21 01:52:22.989131 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.989079 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 01:52:22.990563 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.990683 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990570 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.990683 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990600 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.990683 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990670 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.990683 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990677 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 01:52:22.990890 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990697 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.990890 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990743 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-web-config\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.990890 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990775 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.990890 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990863 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.991051 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990916 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.991051 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.990974 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.991051 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.991009 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxsz9\" (UniqueName: \"kubernetes.io/projected/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-kube-api-access-xxsz9\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.991051 
ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.991037 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.991272 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.991070 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-config\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.991272 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.991095 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.991272 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.991120 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.991272 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.991142 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.991272 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.991180 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.991272 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.991203 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:22.992029 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.992005 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 01:52:23.001825 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:22.999289 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 01:52:23.092475 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092424 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.092475 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxsz9\" (UniqueName: 
\"kubernetes.io/projected/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-kube-api-access-xxsz9\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.092713 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092505 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.092713 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092537 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-config\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.092713 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092561 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.092713 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.092713 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092606 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-config-out\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.092713 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092627 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.092713 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092653 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.092713 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.093095 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092715 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 01:52:23.093095 ip-10-0-141-35 kubenswrapper[2568]: I0421 
01:52:23.092747 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.093095 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092798 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.093095 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092827 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.093095 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092853 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-web-config\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.093095 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092887 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.093095 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092940 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.093095 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.092972 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.093842 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.093813 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.093842 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.093864 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.094228 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.094206 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.096127 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.096033 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-config-out\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.097101 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.097043 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.097536 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.097511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.098247 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.098159 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.098849 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.098815 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.100198 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.099705 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.100198 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.100157 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.100922 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.100876 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.101294 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.101268 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-config\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.101914 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.101893 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.102750 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.102719 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.102750 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.102732 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-web-config\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.102896 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.102752 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.103906 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.103883 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.115743 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.115717 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxsz9\" (UniqueName: \"kubernetes.io/projected/19cd4ebd-4136-4d50-af9b-5f703d01d7c8-kube-api-access-xxsz9\") pod \"prometheus-k8s-0\" (UID: \"19cd4ebd-4136-4d50-af9b-5f703d01d7c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.206143 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.205866 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8654d455dc-jnl96"
Apr 21 01:52:23.271133 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.271047 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q" event={"ID":"4dd276dd-7cf5-4018-bea5-741cfedf9db9","Type":"ContainerStarted","Data":"c96a3fb464181d1a194df2869b2f86db64f6c2e081483194bab27f14f24daf49"}
Apr 21 01:52:23.296481 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.296444 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:23.457259 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:23.457224 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 01:52:23.462845 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:52:23.462814 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19cd4ebd_4136_4d50_af9b_5f703d01d7c8.slice/crio-8a28c8979abf07242e94c9a235c15f9f2291146693409b327b21ffb8fb946729 WatchSource:0}: Error finding container 8a28c8979abf07242e94c9a235c15f9f2291146693409b327b21ffb8fb946729: Status 404 returned error can't find the container with id 8a28c8979abf07242e94c9a235c15f9f2291146693409b327b21ffb8fb946729
Apr 21 01:52:24.275214 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:24.275179 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"19cd4ebd-4136-4d50-af9b-5f703d01d7c8","Type":"ContainerStarted","Data":"8a28c8979abf07242e94c9a235c15f9f2291146693409b327b21ffb8fb946729"}
Apr 21 01:52:25.282077 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:25.282039 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q" event={"ID":"4dd276dd-7cf5-4018-bea5-741cfedf9db9","Type":"ContainerStarted","Data":"118992c1e8d5b647480d43212bd2079b93c07822ac765664cb5e476e0ad19023"}
Apr 21 01:52:25.282546 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:25.282273 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q"
Apr 21 01:52:25.287992 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:25.287969 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q"
Apr 21 01:52:25.296010 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:25.295964 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gwb4q" podStartSLOduration=2.471779939 podStartE2EDuration="4.295947915s" podCreationTimestamp="2026-04-21 01:52:21 +0000 UTC" firstStartedPulling="2026-04-21 01:52:22.593499945 +0000 UTC m=+122.363462102" lastFinishedPulling="2026-04-21 01:52:24.417667906 +0000 UTC m=+124.187630078" observedRunningTime="2026-04-21 01:52:25.295545188 +0000 UTC m=+125.065507366" watchObservedRunningTime="2026-04-21 01:52:25.295947915 +0000 UTC m=+125.065910092"
Apr 21 01:52:26.287019 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:26.286978 2568 generic.go:358] "Generic (PLEG): container finished" podID="19cd4ebd-4136-4d50-af9b-5f703d01d7c8" containerID="6cf0ecab6b9df69a5cd542d2fd6fc094485d3febe4a7911000c9172de1217941" exitCode=0
Apr 21 01:52:26.287466 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:26.287070 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"19cd4ebd-4136-4d50-af9b-5f703d01d7c8","Type":"ContainerDied","Data":"6cf0ecab6b9df69a5cd542d2fd6fc094485d3febe4a7911000c9172de1217941"}
Apr 21 01:52:30.301858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:30.301817 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"19cd4ebd-4136-4d50-af9b-5f703d01d7c8","Type":"ContainerStarted","Data":"7c1284b47cfb432426ca9fc803704e0b7d1fa0c54c162dba4b420a0fb0c93363"}
Apr 21 01:52:30.301858 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:30.301862 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"19cd4ebd-4136-4d50-af9b-5f703d01d7c8","Type":"ContainerStarted","Data":"090b976799c021b1caf9387e730bfb93070a34d9afd08ae3b137d2101eae858e"}
Apr 21 01:52:30.468325 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:30.468279 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:52:30.470989 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:30.470961 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/333616b1-f960-4eb6-b4fd-448534b9cd3a-metrics-certs\") pod \"network-metrics-daemon-pqvmq\" (UID: \"333616b1-f960-4eb6-b4fd-448534b9cd3a\") " pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:52:30.692816 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:30.692786 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-68xr4\""
Apr 21 01:52:30.700494 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:30.700471 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pqvmq"
Apr 21 01:52:30.826789 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:30.826759 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pqvmq"]
Apr 21 01:52:31.373893 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:52:31.373848 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod333616b1_f960_4eb6_b4fd_448534b9cd3a.slice/crio-53b355485e0fa154bfa59c48f54c73a7bf8d3ca72c7c49593f35c8e5f196f45e WatchSource:0}: Error finding container 53b355485e0fa154bfa59c48f54c73a7bf8d3ca72c7c49593f35c8e5f196f45e: Status 404 returned error can't find the container with id 53b355485e0fa154bfa59c48f54c73a7bf8d3ca72c7c49593f35c8e5f196f45e
Apr 21 01:52:32.310804 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:32.310723 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"19cd4ebd-4136-4d50-af9b-5f703d01d7c8","Type":"ContainerStarted","Data":"87e195b393a8323503dc7be38cd58f81c6bf90f367eee491889791414cbecf3b"}
Apr 21 01:52:32.310804 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:32.310762 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"19cd4ebd-4136-4d50-af9b-5f703d01d7c8","Type":"ContainerStarted","Data":"d9c2255a5933123809ffd1504f5bb1f0fa79a91411da4327113f2df9da7156fa"}
Apr 21 01:52:32.310804 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:32.310772 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"19cd4ebd-4136-4d50-af9b-5f703d01d7c8","Type":"ContainerStarted","Data":"3aaf53e6f27630b525c23d63444c988f07e9576e259ede80e9a1ace555c2bc0c"}
Apr 21 01:52:32.310804 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:32.310780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"19cd4ebd-4136-4d50-af9b-5f703d01d7c8","Type":"ContainerStarted","Data":"a346f403c31f3f2aacb92b013fcab59dd94302ae43fd4deb57d9793e34bef427"}
Apr 21 01:52:32.311863 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:32.311839 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pqvmq" event={"ID":"333616b1-f960-4eb6-b4fd-448534b9cd3a","Type":"ContainerStarted","Data":"53b355485e0fa154bfa59c48f54c73a7bf8d3ca72c7c49593f35c8e5f196f45e"}
Apr 21 01:52:32.337722 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:32.337669 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.065913591 podStartE2EDuration="10.33765437s" podCreationTimestamp="2026-04-21 01:52:22 +0000 UTC" firstStartedPulling="2026-04-21 01:52:23.466039707 +0000 UTC m=+123.236001863" lastFinishedPulling="2026-04-21 01:52:31.737780475 +0000 UTC m=+131.507742642" observedRunningTime="2026-04-21 01:52:32.335036113 +0000 UTC m=+132.104998306" watchObservedRunningTime="2026-04-21 01:52:32.33765437 +0000 UTC m=+132.107616544"
Apr 21 01:52:33.297160 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:33.297129 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:52:33.316843 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:33.316812 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pqvmq" event={"ID":"333616b1-f960-4eb6-b4fd-448534b9cd3a","Type":"ContainerStarted","Data":"55841e863f2bb0303674d7f4bf5860fa8ac3edab6e0b3f897fe3508b05efeb53"}
Apr 21 01:52:33.316843 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:33.316843 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pqvmq" event={"ID":"333616b1-f960-4eb6-b4fd-448534b9cd3a","Type":"ContainerStarted","Data":"f3d0f217a4b543fc00042d2c99e72f310884972c8197b83044dff1be3b401749"}
Apr 21 01:52:33.332071 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:33.332029 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pqvmq" podStartSLOduration=131.782678672 podStartE2EDuration="2m13.332017346s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:52:31.375884618 +0000 UTC m=+131.145846774" lastFinishedPulling="2026-04-21 01:52:32.925223295 +0000 UTC m=+132.695185448" observedRunningTime="2026-04-21 01:52:33.329398406 +0000 UTC m=+133.099360582" watchObservedRunningTime="2026-04-21 01:52:33.332017346 +0000 UTC m=+133.101979521"
Apr 21 01:52:43.347865 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:43.347830 2568 generic.go:358] "Generic (PLEG): container finished" podID="35c0ff07-3052-4608-8a7c-4b86babf4ea2" containerID="996dfab57e96cb6d279aa581157f815a5980cf2aadda8610e6e1c9d2a9610f95" exitCode=0
Apr 21 01:52:43.348271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:43.347892 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-cqf6k" event={"ID":"35c0ff07-3052-4608-8a7c-4b86babf4ea2","Type":"ContainerDied","Data":"996dfab57e96cb6d279aa581157f815a5980cf2aadda8610e6e1c9d2a9610f95"}
Apr 21 01:52:43.348271 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:43.348219 2568 scope.go:117] "RemoveContainer" containerID="996dfab57e96cb6d279aa581157f815a5980cf2aadda8610e6e1c9d2a9610f95"
Apr 21 01:52:44.158209 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:44.158182 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9zg7k_7d936ad0-666c-49cb-8f95-d04607cc5b52/serve-healthcheck-canary/0.log"
Apr 21 01:52:44.352722 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:52:44.352680 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-cqf6k" event={"ID":"35c0ff07-3052-4608-8a7c-4b86babf4ea2","Type":"ContainerStarted","Data":"33ff572e40e99cd3b7d815b785211162059c4011b85aa06f658fa3bc0b8a4534"}
Apr 21 01:53:03.413400 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:03.413298 2568 generic.go:358] "Generic (PLEG): container finished" podID="fca9b761-dba2-40df-99e6-41e04c0a7ffb" containerID="9ed4de03891cd87486173fb0973d8bfda0054db30784b3ebf2eb7bbabd15cd2b" exitCode=0
Apr 21 01:53:03.413400 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:03.413371 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" event={"ID":"fca9b761-dba2-40df-99e6-41e04c0a7ffb","Type":"ContainerDied","Data":"9ed4de03891cd87486173fb0973d8bfda0054db30784b3ebf2eb7bbabd15cd2b"}
Apr 21 01:53:03.413859 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:03.413683 2568 scope.go:117] "RemoveContainer" containerID="9ed4de03891cd87486173fb0973d8bfda0054db30784b3ebf2eb7bbabd15cd2b"
Apr 21 01:53:04.417993 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:04.417959 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-75sps" event={"ID":"fca9b761-dba2-40df-99e6-41e04c0a7ffb","Type":"ContainerStarted","Data":"b9ae720f069d57895c8164a80e282327725307e84f52b7cf341e8c127bf819ea"}
Apr 21 01:53:08.430069 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:08.430036 2568 generic.go:358] "Generic (PLEG): container finished" podID="890e70ae-a9fe-418c-8f2f-c3e3ba235674" containerID="06af163a64127e84c85b89ddd310c663ba608b7b28f1dfbb2cdb1e44b4729bfe" exitCode=0
Apr 21 01:53:08.430666 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:08.430117 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" event={"ID":"890e70ae-a9fe-418c-8f2f-c3e3ba235674","Type":"ContainerDied","Data":"06af163a64127e84c85b89ddd310c663ba608b7b28f1dfbb2cdb1e44b4729bfe"}
Apr 21 01:53:08.430666 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:08.430543 2568 scope.go:117] "RemoveContainer" containerID="06af163a64127e84c85b89ddd310c663ba608b7b28f1dfbb2cdb1e44b4729bfe"
Apr 21 01:53:09.434971 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:09.434941 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8k7zj" event={"ID":"890e70ae-a9fe-418c-8f2f-c3e3ba235674","Type":"ContainerStarted","Data":"16b46888476ebfe95c3623ba5912216be606ce22cd366661761171ffdd43048b"}
Apr 21 01:53:23.297647 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:23.297613 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:53:23.316963 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:23.316937 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:53:23.490704 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:53:23.490677 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 01:54:42.904936 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:42.904858 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-47fz2"]
Apr 21 01:54:42.907904 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:42.907885 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:42.910798 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:42.910780 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 01:54:42.915550 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:42.915527 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-47fz2"]
Apr 21 01:54:42.961157 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:42.961121 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3d6924f9-e30d-4086-95a0-8d525b528e62-dbus\") pod \"global-pull-secret-syncer-47fz2\" (UID: \"3d6924f9-e30d-4086-95a0-8d525b528e62\") " pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:42.961332 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:42.961176 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3d6924f9-e30d-4086-95a0-8d525b528e62-kubelet-config\") pod \"global-pull-secret-syncer-47fz2\" (UID: \"3d6924f9-e30d-4086-95a0-8d525b528e62\") " pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:42.961332 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:42.961198 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d6924f9-e30d-4086-95a0-8d525b528e62-original-pull-secret\") pod \"global-pull-secret-syncer-47fz2\" (UID: \"3d6924f9-e30d-4086-95a0-8d525b528e62\") " pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:43.062387 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:43.062352 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3d6924f9-e30d-4086-95a0-8d525b528e62-dbus\") pod \"global-pull-secret-syncer-47fz2\" (UID: \"3d6924f9-e30d-4086-95a0-8d525b528e62\") " pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:43.062531 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:43.062423 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3d6924f9-e30d-4086-95a0-8d525b528e62-kubelet-config\") pod \"global-pull-secret-syncer-47fz2\" (UID: \"3d6924f9-e30d-4086-95a0-8d525b528e62\") " pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:43.062531 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:43.062454 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d6924f9-e30d-4086-95a0-8d525b528e62-original-pull-secret\") pod \"global-pull-secret-syncer-47fz2\" (UID: \"3d6924f9-e30d-4086-95a0-8d525b528e62\") " pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:43.062604 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:43.062542 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3d6924f9-e30d-4086-95a0-8d525b528e62-dbus\") pod \"global-pull-secret-syncer-47fz2\" (UID: \"3d6924f9-e30d-4086-95a0-8d525b528e62\") " pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:43.062604 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:43.062542 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3d6924f9-e30d-4086-95a0-8d525b528e62-kubelet-config\") pod \"global-pull-secret-syncer-47fz2\" (UID: \"3d6924f9-e30d-4086-95a0-8d525b528e62\") " pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:43.064819 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:43.064796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d6924f9-e30d-4086-95a0-8d525b528e62-original-pull-secret\") pod \"global-pull-secret-syncer-47fz2\" (UID: \"3d6924f9-e30d-4086-95a0-8d525b528e62\") " pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:43.218024 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:43.217952 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-47fz2"
Apr 21 01:54:43.331998 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:43.331965 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-47fz2"]
Apr 21 01:54:43.334782 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:54:43.334750 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d6924f9_e30d_4086_95a0_8d525b528e62.slice/crio-9e05fadcfd7f2a0824116b4648e8ec475537eb28cd71b101b59fce2f0418718f WatchSource:0}: Error finding container 9e05fadcfd7f2a0824116b4648e8ec475537eb28cd71b101b59fce2f0418718f: Status 404 returned error can't find the container with id 9e05fadcfd7f2a0824116b4648e8ec475537eb28cd71b101b59fce2f0418718f
Apr 21 01:54:43.701251 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:43.701212 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-47fz2" event={"ID":"3d6924f9-e30d-4086-95a0-8d525b528e62","Type":"ContainerStarted","Data":"9e05fadcfd7f2a0824116b4648e8ec475537eb28cd71b101b59fce2f0418718f"}
Apr 21 01:54:48.719880 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:48.719842 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-47fz2" event={"ID":"3d6924f9-e30d-4086-95a0-8d525b528e62","Type":"ContainerStarted","Data":"8a50a3541804f210447f8c011eda697eb2bc9fb65766ba0ee61bc7f622bce3ba"}
Apr 21 01:54:48.732699 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:54:48.732643 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-47fz2" podStartSLOduration=2.379910348 podStartE2EDuration="6.732626281s" podCreationTimestamp="2026-04-21 01:54:42 +0000 UTC" firstStartedPulling="2026-04-21 01:54:43.33635881 +0000 UTC m=+263.106320963" lastFinishedPulling="2026-04-21 01:54:47.68907474 +0000 UTC m=+267.459036896" observedRunningTime="2026-04-21 01:54:48.732417001 +0000 UTC m=+268.502379177" watchObservedRunningTime="2026-04-21 01:54:48.732626281 +0000 UTC m=+268.502588457"
Apr 21 01:55:20.666471 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:20.666442 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/1.log"
Apr 21 01:55:20.667133 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:20.666682 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/1.log"
Apr 21 01:55:20.672284 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:20.672263 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/0.log"
Apr 21 01:55:20.672430 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:20.672335 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/0.log"
Apr 21 01:55:20.677120 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:20.677103 2568 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 01:55:47.647215 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.647180 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"]
Apr 21 01:55:47.650476 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.650457 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"
Apr 21 01:55:47.652656 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.652634 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 21 01:55:47.653472 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.653455 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-jnb8g\""
Apr 21 01:55:47.653568 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.653505 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 21 01:55:47.658630 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.658604 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"]
Apr 21 01:55:47.787739 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.787706 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt7bl\" (UniqueName: \"kubernetes.io/projected/07860c32-39e8-4073-8b12-b849eec6c664-kube-api-access-xt7bl\") pod \"openshift-lws-operator-bfc7f696d-8g4rz\" (UID: \"07860c32-39e8-4073-8b12-b849eec6c664\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"
Apr 21 01:55:47.787910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.787746 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07860c32-39e8-4073-8b12-b849eec6c664-tmp\") pod \"openshift-lws-operator-bfc7f696d-8g4rz\" (UID: \"07860c32-39e8-4073-8b12-b849eec6c664\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"
Apr 21 01:55:47.888676 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.888639 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt7bl\" (UniqueName: \"kubernetes.io/projected/07860c32-39e8-4073-8b12-b849eec6c664-kube-api-access-xt7bl\") pod \"openshift-lws-operator-bfc7f696d-8g4rz\" (UID: \"07860c32-39e8-4073-8b12-b849eec6c664\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"
Apr 21 01:55:47.888860 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.888797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07860c32-39e8-4073-8b12-b849eec6c664-tmp\") pod \"openshift-lws-operator-bfc7f696d-8g4rz\" (UID: \"07860c32-39e8-4073-8b12-b849eec6c664\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"
Apr 21 01:55:47.889123 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.889102 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07860c32-39e8-4073-8b12-b849eec6c664-tmp\") pod \"openshift-lws-operator-bfc7f696d-8g4rz\" (UID: \"07860c32-39e8-4073-8b12-b849eec6c664\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"
Apr 21 01:55:47.896162 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.896138 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt7bl\" (UniqueName: \"kubernetes.io/projected/07860c32-39e8-4073-8b12-b849eec6c664-kube-api-access-xt7bl\") pod \"openshift-lws-operator-bfc7f696d-8g4rz\" (UID: \"07860c32-39e8-4073-8b12-b849eec6c664\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"
Apr 21 01:55:47.968757 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:47.968678 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"
Apr 21 01:55:48.085787 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:48.085754 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz"]
Apr 21 01:55:48.088440 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:55:48.088412 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07860c32_39e8_4073_8b12_b849eec6c664.slice/crio-09b02ab63c3da96e55273aa882b1cd6276520d044e097342a24bedbd9f8e3c35 WatchSource:0}: Error finding container 09b02ab63c3da96e55273aa882b1cd6276520d044e097342a24bedbd9f8e3c35: Status 404 returned error can't find the container with id 09b02ab63c3da96e55273aa882b1cd6276520d044e097342a24bedbd9f8e3c35
Apr 21 01:55:48.089952 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:48.089931 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 01:55:48.894495 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:48.894461 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz" event={"ID":"07860c32-39e8-4073-8b12-b849eec6c664","Type":"ContainerStarted","Data":"09b02ab63c3da96e55273aa882b1cd6276520d044e097342a24bedbd9f8e3c35"}
Apr 21 01:55:52.908427 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:52.908391 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz" event={"ID":"07860c32-39e8-4073-8b12-b849eec6c664","Type":"ContainerStarted","Data":"d0b48b20ca46e1892fb6393cebd14aaacb4a62946105c3c3ca7429a0e9517eb1"}
Apr 21 01:55:52.922789 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:55:52.922743 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8g4rz" podStartSLOduration=2.107654977 podStartE2EDuration="5.922727798s" podCreationTimestamp="2026-04-21 01:55:47 +0000 UTC" firstStartedPulling="2026-04-21 01:55:48.090119508 +0000 UTC m=+327.860081676" lastFinishedPulling="2026-04-21 01:55:51.905192338 +0000 UTC m=+331.675154497" observedRunningTime="2026-04-21 01:55:52.920816544 +0000 UTC m=+332.690778700" watchObservedRunningTime="2026-04-21 01:55:52.922727798 +0000 UTC m=+332.692689972"
Apr 21 01:56:08.038850 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.038769 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb"]
Apr 21 01:56:08.041936 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.041919 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb"
Apr 21 01:56:08.044361 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.044337 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 21 01:56:08.045440 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.045417 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 21 01:56:08.045539 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.045449 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 21 01:56:08.045539 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.045461 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-6s4r5\""
Apr 21 01:56:08.053238 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.053209 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb"]
Apr 21 01:56:08.149443 ip-10-0-141-35 kubenswrapper[2568]:
I0421 01:56:08.149409 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/931c8473-29e3-410c-8ac1-f093c5626f55-manager-config\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.149611 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.149462 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmxrx\" (UniqueName: \"kubernetes.io/projected/931c8473-29e3-410c-8ac1-f093c5626f55-kube-api-access-fmxrx\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.149611 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.149557 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/931c8473-29e3-410c-8ac1-f093c5626f55-metrics-cert\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.149682 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.149625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/931c8473-29e3-410c-8ac1-f093c5626f55-cert\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.250023 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.249992 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/931c8473-29e3-410c-8ac1-f093c5626f55-cert\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.250191 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.250037 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/931c8473-29e3-410c-8ac1-f093c5626f55-manager-config\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.250191 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.250076 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmxrx\" (UniqueName: \"kubernetes.io/projected/931c8473-29e3-410c-8ac1-f093c5626f55-kube-api-access-fmxrx\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.250191 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.250112 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/931c8473-29e3-410c-8ac1-f093c5626f55-metrics-cert\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.250824 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.250796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/931c8473-29e3-410c-8ac1-f093c5626f55-manager-config\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " 
pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.252516 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.252495 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/931c8473-29e3-410c-8ac1-f093c5626f55-cert\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.252609 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.252538 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/931c8473-29e3-410c-8ac1-f093c5626f55-metrics-cert\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.257465 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.257442 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmxrx\" (UniqueName: \"kubernetes.io/projected/931c8473-29e3-410c-8ac1-f093c5626f55-kube-api-access-fmxrx\") pod \"lws-controller-manager-5bf8b8945f-c52mb\" (UID: \"931c8473-29e3-410c-8ac1-f093c5626f55\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.352094 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.352062 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:08.471205 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.471175 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb"] Apr 21 01:56:08.474855 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:56:08.474825 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod931c8473_29e3_410c_8ac1_f093c5626f55.slice/crio-dfaed986d1593e6ca1ef523c03108e5dedafd7f9008f84f5e7ebcdafd8959d1e WatchSource:0}: Error finding container dfaed986d1593e6ca1ef523c03108e5dedafd7f9008f84f5e7ebcdafd8959d1e: Status 404 returned error can't find the container with id dfaed986d1593e6ca1ef523c03108e5dedafd7f9008f84f5e7ebcdafd8959d1e Apr 21 01:56:08.955739 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:08.955701 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" event={"ID":"931c8473-29e3-410c-8ac1-f093c5626f55","Type":"ContainerStarted","Data":"dfaed986d1593e6ca1ef523c03108e5dedafd7f9008f84f5e7ebcdafd8959d1e"} Apr 21 01:56:10.962090 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:10.962053 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" event={"ID":"931c8473-29e3-410c-8ac1-f093c5626f55","Type":"ContainerStarted","Data":"081f194df9b29f5967d49efcc76739c66529a9eb842c159c5595c4b7c2a9e167"} Apr 21 01:56:10.962435 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:10.962108 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:10.977386 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:10.977342 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" podStartSLOduration=0.591029453 podStartE2EDuration="2.977303237s" podCreationTimestamp="2026-04-21 01:56:08 +0000 UTC" firstStartedPulling="2026-04-21 01:56:08.477175077 +0000 UTC m=+348.247137233" lastFinishedPulling="2026-04-21 01:56:10.863448864 +0000 UTC m=+350.633411017" observedRunningTime="2026-04-21 01:56:10.97642351 +0000 UTC m=+350.746385685" watchObservedRunningTime="2026-04-21 01:56:10.977303237 +0000 UTC m=+350.747265412" Apr 21 01:56:12.325093 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:12.325060 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468"] Apr 21 01:56:12.327351 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:12.327330 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:12.332795 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:56:12.332769 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"opendatahub-operator-controller-webhook-cert\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"opendatahub\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" type="*v1.Secret" Apr 21 01:56:12.332922 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:56:12.332789 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"opendatahub\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object" logger="UnhandledError" 
reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 21 01:56:12.332989 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:56:12.332946 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"opendatahub-operator-controller-manager-service-cert\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"opendatahub\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" type="*v1.Secret" Apr 21 01:56:12.334437 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:12.334227 2568 status_manager.go:895] "Failed to get status for pod" podUID="228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" err="pods \"opendatahub-operator-controller-manager-64bbc69db5-55468\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"opendatahub\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object" Apr 21 01:56:12.334437 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:56:12.334335 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"opendatahub-operator-controller-manager-dockercfg-jjk2t\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"opendatahub\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jjk2t\"" type="*v1.Secret" Apr 21 01:56:12.334437 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:56:12.334407 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps 
\"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"opendatahub\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 21 01:56:12.349671 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:12.349641 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468"] Apr 21 01:56:12.487568 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:12.487532 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95-apiservice-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-55468\" (UID: \"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:12.487745 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:12.487581 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpm7\" (UniqueName: \"kubernetes.io/projected/228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95-kube-api-access-pvpm7\") pod \"opendatahub-operator-controller-manager-64bbc69db5-55468\" (UID: \"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:12.487745 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:12.487633 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95-webhook-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-55468\" (UID: \"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95\") " 
pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:12.589104 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:12.589071 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95-apiservice-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-55468\" (UID: \"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:12.589245 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:12.589126 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpm7\" (UniqueName: \"kubernetes.io/projected/228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95-kube-api-access-pvpm7\") pod \"opendatahub-operator-controller-manager-64bbc69db5-55468\" (UID: \"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:12.589245 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:12.589166 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95-webhook-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-55468\" (UID: \"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:13.467173 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:13.467143 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 01:56:13.498931 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:13.498907 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 01:56:13.504443 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:13.504426 2568 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 01:56:13.509129 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:13.509101 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpm7\" (UniqueName: \"kubernetes.io/projected/228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95-kube-api-access-pvpm7\") pod \"opendatahub-operator-controller-manager-64bbc69db5-55468\" (UID: \"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:13.511714 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:13.511690 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95-apiservice-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-55468\" (UID: \"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:13.511824 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:13.511773 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95-webhook-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-55468\" (UID: \"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:13.765246 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:13.765164 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 01:56:13.773869 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:13.773849 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jjk2t\"" Apr 21 01:56:13.779551 ip-10-0-141-35 
kubenswrapper[2568]: I0421 01:56:13.779537 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:13.907685 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:13.907547 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468"] Apr 21 01:56:13.909937 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:56:13.909910 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod228b1e7c_a37b_4b2d_bdb3_cd860c6e5a95.slice/crio-2134dcaa4558e6aa4fcd6cdc381490a93cd5d78635fd6a4b55c26994b230ac3f WatchSource:0}: Error finding container 2134dcaa4558e6aa4fcd6cdc381490a93cd5d78635fd6a4b55c26994b230ac3f: Status 404 returned error can't find the container with id 2134dcaa4558e6aa4fcd6cdc381490a93cd5d78635fd6a4b55c26994b230ac3f Apr 21 01:56:13.971866 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:13.971836 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" event={"ID":"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95","Type":"ContainerStarted","Data":"2134dcaa4558e6aa4fcd6cdc381490a93cd5d78635fd6a4b55c26994b230ac3f"} Apr 21 01:56:16.986225 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:16.986192 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" event={"ID":"228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95","Type":"ContainerStarted","Data":"45ce6dcdbf9e8a3fdd5464821cafb600f3ec07d2cc599e9dd535d6424f57bf0a"} Apr 21 01:56:16.986606 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:16.986320 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:56:17.004367 ip-10-0-141-35 kubenswrapper[2568]: I0421 
01:56:17.004299 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" podStartSLOduration=2.782997609 podStartE2EDuration="5.004283629s" podCreationTimestamp="2026-04-21 01:56:12 +0000 UTC" firstStartedPulling="2026-04-21 01:56:13.911697321 +0000 UTC m=+353.681659478" lastFinishedPulling="2026-04-21 01:56:16.132983342 +0000 UTC m=+355.902945498" observedRunningTime="2026-04-21 01:56:17.002634604 +0000 UTC m=+356.772596778" watchObservedRunningTime="2026-04-21 01:56:17.004283629 +0000 UTC m=+356.774245804" Apr 21 01:56:21.967300 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:21.967256 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-c52mb" Apr 21 01:56:27.992266 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:56:27.992233 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-55468" Apr 21 01:57:00.636399 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.636359 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb"] Apr 21 01:57:00.642871 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.642848 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.645141 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.645114 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 01:57:00.645299 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.645273 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-sh72m\"" Apr 21 01:57:00.645299 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.645278 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 01:57:00.645460 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.645323 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 01:57:00.651896 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.651874 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb"] Apr 21 01:57:00.808909 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.808879 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d0d0a9bb-7444-42ba-952e-886dda685898-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.809086 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.808920 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/d0d0a9bb-7444-42ba-952e-886dda685898-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.809086 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.808945 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.809086 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.808967 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.809086 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.808984 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.809086 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.809008 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9fv2f\" (UniqueName: \"kubernetes.io/projected/d0d0a9bb-7444-42ba-952e-886dda685898-kube-api-access-9fv2f\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.809086 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.809027 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.809372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.809118 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d0d0a9bb-7444-42ba-952e-886dda685898-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.809372 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.809178 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910296 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910206 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d0d0a9bb-7444-42ba-952e-886dda685898-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910296 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910296 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910595 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910302 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910595 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910353 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fv2f\" (UniqueName: 
\"kubernetes.io/projected/d0d0a9bb-7444-42ba-952e-886dda685898-kube-api-access-9fv2f\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910595 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910507 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910595 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910583 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d0d0a9bb-7444-42ba-952e-886dda685898-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910794 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910647 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910794 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910749 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/d0d0a9bb-7444-42ba-952e-886dda685898-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910794 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.910794 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910782 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.911006 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.910795 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.911061 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.911040 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-credential-socket\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.911114 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.911070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d0d0a9bb-7444-42ba-952e-886dda685898-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.912827 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.912800 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d0d0a9bb-7444-42ba-952e-886dda685898-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.913249 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.913233 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d0d0a9bb-7444-42ba-952e-886dda685898-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.917438 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.917415 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fv2f\" (UniqueName: \"kubernetes.io/projected/d0d0a9bb-7444-42ba-952e-886dda685898-kube-api-access-9fv2f\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: 
\"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.917679 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.917657 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d0d0a9bb-7444-42ba-952e-886dda685898-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb\" (UID: \"d0d0a9bb-7444-42ba-952e-886dda685898\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:00.955517 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:00.955488 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:01.086524 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:01.086491 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb"] Apr 21 01:57:01.090058 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:57:01.090031 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0d0a9bb_7444_42ba_952e_886dda685898.slice/crio-eda848d7848c025044d5792a96fcd14057614ed6fc990667e9c20be0c2f842ec WatchSource:0}: Error finding container eda848d7848c025044d5792a96fcd14057614ed6fc990667e9c20be0c2f842ec: Status 404 returned error can't find the container with id eda848d7848c025044d5792a96fcd14057614ed6fc990667e9c20be0c2f842ec Apr 21 01:57:01.126048 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:01.126021 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" 
event={"ID":"d0d0a9bb-7444-42ba-952e-886dda685898","Type":"ContainerStarted","Data":"eda848d7848c025044d5792a96fcd14057614ed6fc990667e9c20be0c2f842ec"} Apr 21 01:57:03.836418 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:03.836379 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 01:57:03.836700 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:03.836451 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 01:57:03.836700 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:03.836481 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 01:57:04.138042 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:04.137947 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" event={"ID":"d0d0a9bb-7444-42ba-952e-886dda685898","Type":"ContainerStarted","Data":"ec88387f13b0299ac1a89aac04787eb2c660460922e8283e2d6d67c7272b3cb5"} Apr 21 01:57:04.158217 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:04.158165 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" podStartSLOduration=1.4142304430000001 podStartE2EDuration="4.158150892s" podCreationTimestamp="2026-04-21 01:57:00 +0000 UTC" firstStartedPulling="2026-04-21 01:57:01.092191331 +0000 UTC m=+400.862153484" lastFinishedPulling="2026-04-21 01:57:03.836111771 +0000 UTC m=+403.606073933" observedRunningTime="2026-04-21 01:57:04.155543404 +0000 UTC m=+403.925505580" watchObservedRunningTime="2026-04-21 01:57:04.158150892 +0000 UTC 
m=+403.928113067" Apr 21 01:57:04.955883 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:04.955846 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:04.960583 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:04.960561 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:05.141003 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:05.140975 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:05.141866 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:05.141847 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb" Apr 21 01:57:31.170194 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.170162 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nc7tk"] Apr 21 01:57:31.173857 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.173828 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" Apr 21 01:57:31.176051 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.176029 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 01:57:31.176142 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.176029 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-4pd8z\"" Apr 21 01:57:31.177043 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.177024 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 01:57:31.179881 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.179856 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nc7tk"] Apr 21 01:57:31.274575 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.274526 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kdtz\" (UniqueName: \"kubernetes.io/projected/df15e93c-b2ba-4efd-a1f0-fd93e0808175-kube-api-access-9kdtz\") pod \"kuadrant-operator-catalog-nc7tk\" (UID: \"df15e93c-b2ba-4efd-a1f0-fd93e0808175\") " pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" Apr 21 01:57:31.375610 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.375571 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kdtz\" (UniqueName: \"kubernetes.io/projected/df15e93c-b2ba-4efd-a1f0-fd93e0808175-kube-api-access-9kdtz\") pod \"kuadrant-operator-catalog-nc7tk\" (UID: \"df15e93c-b2ba-4efd-a1f0-fd93e0808175\") " pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" Apr 21 01:57:31.383213 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.383188 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kdtz\" (UniqueName: 
\"kubernetes.io/projected/df15e93c-b2ba-4efd-a1f0-fd93e0808175-kube-api-access-9kdtz\") pod \"kuadrant-operator-catalog-nc7tk\" (UID: \"df15e93c-b2ba-4efd-a1f0-fd93e0808175\") " pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" Apr 21 01:57:31.485178 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.485095 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" Apr 21 01:57:31.544564 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.544499 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nc7tk"] Apr 21 01:57:31.604297 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.604273 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nc7tk"] Apr 21 01:57:31.607529 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:57:31.607495 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf15e93c_b2ba_4efd_a1f0_fd93e0808175.slice/crio-28fba0244eeeb666f5ac25fc8bbb3f2d5965cb14d72b00950f54848e02eb5f16 WatchSource:0}: Error finding container 28fba0244eeeb666f5ac25fc8bbb3f2d5965cb14d72b00950f54848e02eb5f16: Status 404 returned error can't find the container with id 28fba0244eeeb666f5ac25fc8bbb3f2d5965cb14d72b00950f54848e02eb5f16 Apr 21 01:57:31.750904 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.750828 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wwjvh"] Apr 21 01:57:31.753708 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.753692 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" Apr 21 01:57:31.762217 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.761024 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wwjvh"] Apr 21 01:57:31.878624 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.878590 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlx5t\" (UniqueName: \"kubernetes.io/projected/1d54a962-a45f-4db5-b8db-7a54953f9142-kube-api-access-wlx5t\") pod \"kuadrant-operator-catalog-wwjvh\" (UID: \"1d54a962-a45f-4db5-b8db-7a54953f9142\") " pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" Apr 21 01:57:31.979546 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.979507 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlx5t\" (UniqueName: \"kubernetes.io/projected/1d54a962-a45f-4db5-b8db-7a54953f9142-kube-api-access-wlx5t\") pod \"kuadrant-operator-catalog-wwjvh\" (UID: \"1d54a962-a45f-4db5-b8db-7a54953f9142\") " pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" Apr 21 01:57:31.986838 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:31.986808 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlx5t\" (UniqueName: \"kubernetes.io/projected/1d54a962-a45f-4db5-b8db-7a54953f9142-kube-api-access-wlx5t\") pod \"kuadrant-operator-catalog-wwjvh\" (UID: \"1d54a962-a45f-4db5-b8db-7a54953f9142\") " pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" Apr 21 01:57:32.070102 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:32.070024 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" Apr 21 01:57:32.185106 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:32.185065 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wwjvh"] Apr 21 01:57:32.187387 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:57:32.187350 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d54a962_a45f_4db5_b8db_7a54953f9142.slice/crio-8e1aecc758d7cb8cb7befe89dc2366b3ea16a8603fd584ba4c2944928a06fe35 WatchSource:0}: Error finding container 8e1aecc758d7cb8cb7befe89dc2366b3ea16a8603fd584ba4c2944928a06fe35: Status 404 returned error can't find the container with id 8e1aecc758d7cb8cb7befe89dc2366b3ea16a8603fd584ba4c2944928a06fe35 Apr 21 01:57:32.227800 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:32.227766 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" event={"ID":"1d54a962-a45f-4db5-b8db-7a54953f9142","Type":"ContainerStarted","Data":"8e1aecc758d7cb8cb7befe89dc2366b3ea16a8603fd584ba4c2944928a06fe35"} Apr 21 01:57:32.228790 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:32.228765 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" event={"ID":"df15e93c-b2ba-4efd-a1f0-fd93e0808175","Type":"ContainerStarted","Data":"28fba0244eeeb666f5ac25fc8bbb3f2d5965cb14d72b00950f54848e02eb5f16"} Apr 21 01:57:34.239150 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:34.239063 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" event={"ID":"1d54a962-a45f-4db5-b8db-7a54953f9142","Type":"ContainerStarted","Data":"cc00cd9972a893944323855ee8921e3fc46272efe91a069eb5b18483355adb7d"} Apr 21 01:57:34.240377 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:34.240356 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" event={"ID":"df15e93c-b2ba-4efd-a1f0-fd93e0808175","Type":"ContainerStarted","Data":"3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb"} Apr 21 01:57:34.240516 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:34.240491 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" podUID="df15e93c-b2ba-4efd-a1f0-fd93e0808175" containerName="registry-server" containerID="cri-o://3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb" gracePeriod=2 Apr 21 01:57:34.252623 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:34.252573 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" podStartSLOduration=1.560479016 podStartE2EDuration="3.252557719s" podCreationTimestamp="2026-04-21 01:57:31 +0000 UTC" firstStartedPulling="2026-04-21 01:57:32.189118761 +0000 UTC m=+431.959080915" lastFinishedPulling="2026-04-21 01:57:33.881197445 +0000 UTC m=+433.651159618" observedRunningTime="2026-04-21 01:57:34.2525529 +0000 UTC m=+434.022515074" watchObservedRunningTime="2026-04-21 01:57:34.252557719 +0000 UTC m=+434.022519894" Apr 21 01:57:34.265136 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:34.265092 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" podStartSLOduration=0.996114157 podStartE2EDuration="3.265077893s" podCreationTimestamp="2026-04-21 01:57:31 +0000 UTC" firstStartedPulling="2026-04-21 01:57:31.609209662 +0000 UTC m=+431.379171815" lastFinishedPulling="2026-04-21 01:57:33.878173386 +0000 UTC m=+433.648135551" observedRunningTime="2026-04-21 01:57:34.264935259 +0000 UTC m=+434.034897435" watchObservedRunningTime="2026-04-21 01:57:34.265077893 +0000 UTC m=+434.035040068" Apr 21 01:57:34.486685 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:34.486662 2568 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" Apr 21 01:57:34.604778 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:34.604747 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kdtz\" (UniqueName: \"kubernetes.io/projected/df15e93c-b2ba-4efd-a1f0-fd93e0808175-kube-api-access-9kdtz\") pod \"df15e93c-b2ba-4efd-a1f0-fd93e0808175\" (UID: \"df15e93c-b2ba-4efd-a1f0-fd93e0808175\") " Apr 21 01:57:34.606915 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:34.606875 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df15e93c-b2ba-4efd-a1f0-fd93e0808175-kube-api-access-9kdtz" (OuterVolumeSpecName: "kube-api-access-9kdtz") pod "df15e93c-b2ba-4efd-a1f0-fd93e0808175" (UID: "df15e93c-b2ba-4efd-a1f0-fd93e0808175"). InnerVolumeSpecName "kube-api-access-9kdtz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:57:34.706148 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:34.706111 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9kdtz\" (UniqueName: \"kubernetes.io/projected/df15e93c-b2ba-4efd-a1f0-fd93e0808175-kube-api-access-9kdtz\") on node \"ip-10-0-141-35.ec2.internal\" DevicePath \"\"" Apr 21 01:57:35.245008 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:35.244918 2568 generic.go:358] "Generic (PLEG): container finished" podID="df15e93c-b2ba-4efd-a1f0-fd93e0808175" containerID="3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb" exitCode=0 Apr 21 01:57:35.245008 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:35.244973 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" Apr 21 01:57:35.245493 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:35.245002 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" event={"ID":"df15e93c-b2ba-4efd-a1f0-fd93e0808175","Type":"ContainerDied","Data":"3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb"} Apr 21 01:57:35.245493 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:35.245055 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nc7tk" event={"ID":"df15e93c-b2ba-4efd-a1f0-fd93e0808175","Type":"ContainerDied","Data":"28fba0244eeeb666f5ac25fc8bbb3f2d5965cb14d72b00950f54848e02eb5f16"} Apr 21 01:57:35.245493 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:35.245084 2568 scope.go:117] "RemoveContainer" containerID="3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb" Apr 21 01:57:35.253920 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:35.253903 2568 scope.go:117] "RemoveContainer" containerID="3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb" Apr 21 01:57:35.254142 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:57:35.254124 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb\": container with ID starting with 3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb not found: ID does not exist" containerID="3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb" Apr 21 01:57:35.254189 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:35.254150 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb"} err="failed to get container status \"3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb\": rpc error: 
code = NotFound desc = could not find container \"3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb\": container with ID starting with 3bfa1b3e2f062d19f7dc89a5d8e2fcb5de6d46faf3785c78b40c3d05ae6bcefb not found: ID does not exist" Apr 21 01:57:35.260780 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:35.260758 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nc7tk"] Apr 21 01:57:35.262980 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:35.262961 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nc7tk"] Apr 21 01:57:36.781520 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:36.781483 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df15e93c-b2ba-4efd-a1f0-fd93e0808175" path="/var/lib/kubelet/pods/df15e93c-b2ba-4efd-a1f0-fd93e0808175/volumes" Apr 21 01:57:42.070865 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:42.070832 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" Apr 21 01:57:42.070865 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:42.070867 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" Apr 21 01:57:42.092347 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:42.092323 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" Apr 21 01:57:42.293096 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:57:42.293069 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-wwjvh" Apr 21 01:58:02.777118 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:02.777089 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh"] Apr 21 01:58:02.777523 ip-10-0-141-35 
kubenswrapper[2568]: I0421 01:58:02.777414 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df15e93c-b2ba-4efd-a1f0-fd93e0808175" containerName="registry-server" Apr 21 01:58:02.777523 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:02.777428 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="df15e93c-b2ba-4efd-a1f0-fd93e0808175" containerName="registry-server" Apr 21 01:58:02.777523 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:02.777481 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="df15e93c-b2ba-4efd-a1f0-fd93e0808175" containerName="registry-server" Apr 21 01:58:02.782389 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:02.781068 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh" Apr 21 01:58:02.784531 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:02.784507 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-pc72f\"" Apr 21 01:58:02.784763 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:02.784748 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 21 01:58:02.790859 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:02.790836 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh"] Apr 21 01:58:02.940249 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:02.940208 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np9b8\" (UniqueName: \"kubernetes.io/projected/fd19c6b1-9354-4bea-bb71-96ba67635021-kube-api-access-np9b8\") pod \"dns-operator-controller-manager-648d5c98bc-t8lwh\" (UID: \"fd19c6b1-9354-4bea-bb71-96ba67635021\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh" 
Apr 21 01:58:03.040802 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:03.040702 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-np9b8\" (UniqueName: \"kubernetes.io/projected/fd19c6b1-9354-4bea-bb71-96ba67635021-kube-api-access-np9b8\") pod \"dns-operator-controller-manager-648d5c98bc-t8lwh\" (UID: \"fd19c6b1-9354-4bea-bb71-96ba67635021\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh" Apr 21 01:58:03.058281 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:03.058247 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-np9b8\" (UniqueName: \"kubernetes.io/projected/fd19c6b1-9354-4bea-bb71-96ba67635021-kube-api-access-np9b8\") pod \"dns-operator-controller-manager-648d5c98bc-t8lwh\" (UID: \"fd19c6b1-9354-4bea-bb71-96ba67635021\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh" Apr 21 01:58:03.095185 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:03.095154 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh"
Apr 21 01:58:03.212556 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:03.212436 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh"]
Apr 21 01:58:03.215577 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:58:03.215549 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd19c6b1_9354_4bea_bb71_96ba67635021.slice/crio-51f069c29200b85fc2baf732a6e9a3c903efb476936020981f4bcd7d95c9ccc3 WatchSource:0}: Error finding container 51f069c29200b85fc2baf732a6e9a3c903efb476936020981f4bcd7d95c9ccc3: Status 404 returned error can't find the container with id 51f069c29200b85fc2baf732a6e9a3c903efb476936020981f4bcd7d95c9ccc3
Apr 21 01:58:03.341384 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:03.341349 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh" event={"ID":"fd19c6b1-9354-4bea-bb71-96ba67635021","Type":"ContainerStarted","Data":"51f069c29200b85fc2baf732a6e9a3c903efb476936020981f4bcd7d95c9ccc3"}
Apr 21 01:58:06.183013 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.182977 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"]
Apr 21 01:58:06.185421 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.185404 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"
Apr 21 01:58:06.187672 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.187651 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-4hz8x\""
Apr 21 01:58:06.195112 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.195092 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"]
Apr 21 01:58:06.354953 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.354923 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh" event={"ID":"fd19c6b1-9354-4bea-bb71-96ba67635021","Type":"ContainerStarted","Data":"6bc1e21a476b70b522d1adb540479decd1bbcf161b83306fc8756bd6b0f16d7a"}
Apr 21 01:58:06.355136 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.354999 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh"
Apr 21 01:58:06.373055 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.373030 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8szkb\" (UniqueName: \"kubernetes.io/projected/2fb0be52-9b4d-46e9-908a-95ffb37cef44-kube-api-access-8szkb\") pod \"limitador-operator-controller-manager-85c4996f8c-59fvl\" (UID: \"2fb0be52-9b4d-46e9-908a-95ffb37cef44\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"
Apr 21 01:58:06.379062 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.379020 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh" podStartSLOduration=2.111257719 podStartE2EDuration="4.379008134s" podCreationTimestamp="2026-04-21 01:58:02 +0000 UTC" firstStartedPulling="2026-04-21 01:58:03.2180065 +0000 UTC m=+462.987968653" lastFinishedPulling="2026-04-21 01:58:05.485756904 +0000 UTC m=+465.255719068" observedRunningTime="2026-04-21 01:58:06.377575084 +0000 UTC m=+466.147537261" watchObservedRunningTime="2026-04-21 01:58:06.379008134 +0000 UTC m=+466.148970307"
Apr 21 01:58:06.474024 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.473938 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8szkb\" (UniqueName: \"kubernetes.io/projected/2fb0be52-9b4d-46e9-908a-95ffb37cef44-kube-api-access-8szkb\") pod \"limitador-operator-controller-manager-85c4996f8c-59fvl\" (UID: \"2fb0be52-9b4d-46e9-908a-95ffb37cef44\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"
Apr 21 01:58:06.481495 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.481463 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8szkb\" (UniqueName: \"kubernetes.io/projected/2fb0be52-9b4d-46e9-908a-95ffb37cef44-kube-api-access-8szkb\") pod \"limitador-operator-controller-manager-85c4996f8c-59fvl\" (UID: \"2fb0be52-9b4d-46e9-908a-95ffb37cef44\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"
Apr 21 01:58:06.496243 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.496214 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"
Apr 21 01:58:06.614813 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:06.614793 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"]
Apr 21 01:58:06.617439 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:58:06.617414 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fb0be52_9b4d_46e9_908a_95ffb37cef44.slice/crio-0faeeaefacc0e8ebe40cbf469e0383c343ec71a8cf9f5a0d1b4f2635fb96b544 WatchSource:0}: Error finding container 0faeeaefacc0e8ebe40cbf469e0383c343ec71a8cf9f5a0d1b4f2635fb96b544: Status 404 returned error can't find the container with id 0faeeaefacc0e8ebe40cbf469e0383c343ec71a8cf9f5a0d1b4f2635fb96b544
Apr 21 01:58:07.360116 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:07.360078 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl" event={"ID":"2fb0be52-9b4d-46e9-908a-95ffb37cef44","Type":"ContainerStarted","Data":"0faeeaefacc0e8ebe40cbf469e0383c343ec71a8cf9f5a0d1b4f2635fb96b544"}
Apr 21 01:58:09.369964 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:09.369928 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl" event={"ID":"2fb0be52-9b4d-46e9-908a-95ffb37cef44","Type":"ContainerStarted","Data":"0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65"}
Apr 21 01:58:09.370467 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:09.370034 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"
Apr 21 01:58:09.385073 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:09.385030 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl" podStartSLOduration=1.323493064 podStartE2EDuration="3.385017784s" podCreationTimestamp="2026-04-21 01:58:06 +0000 UTC" firstStartedPulling="2026-04-21 01:58:06.619507524 +0000 UTC m=+466.389469676" lastFinishedPulling="2026-04-21 01:58:08.681032239 +0000 UTC m=+468.450994396" observedRunningTime="2026-04-21 01:58:09.384217406 +0000 UTC m=+469.154179582" watchObservedRunningTime="2026-04-21 01:58:09.385017784 +0000 UTC m=+469.154979960"
Apr 21 01:58:14.269098 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:14.269063 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-lq9cx"]
Apr 21 01:58:14.271512 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:14.271494 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-lq9cx"
Apr 21 01:58:14.275836 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:14.275815 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-blc5z\""
Apr 21 01:58:14.284262 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:14.284242 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-lq9cx"]
Apr 21 01:58:14.345558 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:14.345526 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qlh8\" (UniqueName: \"kubernetes.io/projected/2071548b-e9d2-4389-851f-3b7520e5793f-kube-api-access-4qlh8\") pod \"authorino-operator-657f44b778-lq9cx\" (UID: \"2071548b-e9d2-4389-851f-3b7520e5793f\") " pod="kuadrant-system/authorino-operator-657f44b778-lq9cx"
Apr 21 01:58:14.446395 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:14.446353 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qlh8\" (UniqueName: \"kubernetes.io/projected/2071548b-e9d2-4389-851f-3b7520e5793f-kube-api-access-4qlh8\") pod \"authorino-operator-657f44b778-lq9cx\" (UID: \"2071548b-e9d2-4389-851f-3b7520e5793f\") " pod="kuadrant-system/authorino-operator-657f44b778-lq9cx"
Apr 21 01:58:14.462331 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:14.461985 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qlh8\" (UniqueName: \"kubernetes.io/projected/2071548b-e9d2-4389-851f-3b7520e5793f-kube-api-access-4qlh8\") pod \"authorino-operator-657f44b778-lq9cx\" (UID: \"2071548b-e9d2-4389-851f-3b7520e5793f\") " pod="kuadrant-system/authorino-operator-657f44b778-lq9cx"
Apr 21 01:58:14.581688 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:14.581606 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-lq9cx"
Apr 21 01:58:14.709513 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:14.709470 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-lq9cx"]
Apr 21 01:58:14.711879 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:58:14.711852 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2071548b_e9d2_4389_851f_3b7520e5793f.slice/crio-676fb779060cc5f2f45b2c88a3f68a18cdf42f4641c7ad53841bf1ddf10a2012 WatchSource:0}: Error finding container 676fb779060cc5f2f45b2c88a3f68a18cdf42f4641c7ad53841bf1ddf10a2012: Status 404 returned error can't find the container with id 676fb779060cc5f2f45b2c88a3f68a18cdf42f4641c7ad53841bf1ddf10a2012
Apr 21 01:58:15.390482 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:15.390442 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-lq9cx" event={"ID":"2071548b-e9d2-4389-851f-3b7520e5793f","Type":"ContainerStarted","Data":"676fb779060cc5f2f45b2c88a3f68a18cdf42f4641c7ad53841bf1ddf10a2012"}
Apr 21 01:58:17.362857 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:17.362823 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-t8lwh"
Apr 21 01:58:17.398947 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:17.398908 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-lq9cx" event={"ID":"2071548b-e9d2-4389-851f-3b7520e5793f","Type":"ContainerStarted","Data":"560ddccf5a8d9e6055f74c7b00ba970d2d04a76e167a9906caa04f844a7ddc12"}
Apr 21 01:58:17.399123 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:17.399031 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-lq9cx"
Apr 21 01:58:17.417816 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:17.417772 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-lq9cx" podStartSLOduration=1.256925441 podStartE2EDuration="3.417757802s" podCreationTimestamp="2026-04-21 01:58:14 +0000 UTC" firstStartedPulling="2026-04-21 01:58:14.713775226 +0000 UTC m=+474.483737379" lastFinishedPulling="2026-04-21 01:58:16.874607573 +0000 UTC m=+476.644569740" observedRunningTime="2026-04-21 01:58:17.416822824 +0000 UTC m=+477.186784999" watchObservedRunningTime="2026-04-21 01:58:17.417757802 +0000 UTC m=+477.187719976"
Apr 21 01:58:20.375250 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:20.375217 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"
Apr 21 01:58:28.404972 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:28.404938 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-lq9cx"
Apr 21 01:58:30.941758 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:30.941728 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"]
Apr 21 01:58:30.942220 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:30.942069 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl" podUID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" containerName="manager" containerID="cri-o://0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65" gracePeriod=2
Apr 21 01:58:30.952991 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:30.952958 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"]
Apr 21 01:58:30.963910 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:30.963886 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl"]
Apr 21 01:58:30.964223 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:30.964210 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" containerName="manager"
Apr 21 01:58:30.964223 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:30.964223 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" containerName="manager"
Apr 21 01:58:30.964361 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:30.964289 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" containerName="manager"
Apr 21 01:58:30.966197 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:30.966179 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl"
Apr 21 01:58:30.968096 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:30.968067 2568 status_manager.go:895] "Failed to get status for pod" podUID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl" err="pods \"limitador-operator-controller-manager-85c4996f8c-59fvl\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object"
Apr 21 01:58:30.976167 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:30.976134 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl"]
Apr 21 01:58:31.087412 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.087380 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdrbt\" (UniqueName: \"kubernetes.io/projected/91a20877-a6ca-4350-a13e-d6cbbba878d2-kube-api-access-wdrbt\") pod \"limitador-operator-controller-manager-85c4996f8c-d4cbl\" (UID: \"91a20877-a6ca-4350-a13e-d6cbbba878d2\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl"
Apr 21 01:58:31.165434 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.165408 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"
Apr 21 01:58:31.167063 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.167037 2568 status_manager.go:895] "Failed to get status for pod" podUID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl" err="pods \"limitador-operator-controller-manager-85c4996f8c-59fvl\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object"
Apr 21 01:58:31.188408 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.188382 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdrbt\" (UniqueName: \"kubernetes.io/projected/91a20877-a6ca-4350-a13e-d6cbbba878d2-kube-api-access-wdrbt\") pod \"limitador-operator-controller-manager-85c4996f8c-d4cbl\" (UID: \"91a20877-a6ca-4350-a13e-d6cbbba878d2\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl"
Apr 21 01:58:31.196172 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.196113 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdrbt\" (UniqueName: \"kubernetes.io/projected/91a20877-a6ca-4350-a13e-d6cbbba878d2-kube-api-access-wdrbt\") pod \"limitador-operator-controller-manager-85c4996f8c-d4cbl\" (UID: \"91a20877-a6ca-4350-a13e-d6cbbba878d2\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl"
Apr 21 01:58:31.289266 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.289231 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8szkb\" (UniqueName: \"kubernetes.io/projected/2fb0be52-9b4d-46e9-908a-95ffb37cef44-kube-api-access-8szkb\") pod \"2fb0be52-9b4d-46e9-908a-95ffb37cef44\" (UID: \"2fb0be52-9b4d-46e9-908a-95ffb37cef44\") "
Apr 21 01:58:31.291284 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.291257 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fb0be52-9b4d-46e9-908a-95ffb37cef44-kube-api-access-8szkb" (OuterVolumeSpecName: "kube-api-access-8szkb") pod "2fb0be52-9b4d-46e9-908a-95ffb37cef44" (UID: "2fb0be52-9b4d-46e9-908a-95ffb37cef44"). InnerVolumeSpecName "kube-api-access-8szkb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 01:58:31.319424 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.319396 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl"
Apr 21 01:58:31.390614 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.390585 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8szkb\" (UniqueName: \"kubernetes.io/projected/2fb0be52-9b4d-46e9-908a-95ffb37cef44-kube-api-access-8szkb\") on node \"ip-10-0-141-35.ec2.internal\" DevicePath \"\""
Apr 21 01:58:31.442523 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.442497 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl"]
Apr 21 01:58:31.444689 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:58:31.444664 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91a20877_a6ca_4350_a13e_d6cbbba878d2.slice/crio-95734c002a76189c807b200424a17bd93ba3b573bbb0c24dd0d694f6d2b264b0 WatchSource:0}: Error finding container 95734c002a76189c807b200424a17bd93ba3b573bbb0c24dd0d694f6d2b264b0: Status 404 returned error can't find the container with id 95734c002a76189c807b200424a17bd93ba3b573bbb0c24dd0d694f6d2b264b0
Apr 21 01:58:31.448070 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.448046 2568 generic.go:358] "Generic (PLEG): container finished" podID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" containerID="0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65" exitCode=0
Apr 21 01:58:31.448156 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.448086 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl"
Apr 21 01:58:31.448156 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.448135 2568 scope.go:117] "RemoveContainer" containerID="0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65"
Apr 21 01:58:31.450161 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.450135 2568 status_manager.go:895] "Failed to get status for pod" podUID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl" err="pods \"limitador-operator-controller-manager-85c4996f8c-59fvl\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object"
Apr 21 01:58:31.460712 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.460686 2568 status_manager.go:895] "Failed to get status for pod" podUID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl" err="pods \"limitador-operator-controller-manager-85c4996f8c-59fvl\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object"
Apr 21 01:58:31.461244 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.461223 2568 scope.go:117] "RemoveContainer" containerID="0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65"
Apr 21 01:58:31.461520 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:58:31.461502 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65\": container with ID starting with 0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65 not found: ID does not exist" containerID="0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65"
Apr 21 01:58:31.461584 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:31.461529 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65"} err="failed to get container status \"0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65\": rpc error: code = NotFound desc = could not find container \"0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65\": container with ID starting with 0eb019bb7821eb197ff01dd51702977142194a8c46faa1ca041d206b1e835e65 not found: ID does not exist"
Apr 21 01:58:32.453112 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:32.453077 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl" event={"ID":"91a20877-a6ca-4350-a13e-d6cbbba878d2","Type":"ContainerStarted","Data":"4cd068af562a1f03cc2d88e140a67a233bac9139a8412013e82808091dc41b3c"}
Apr 21 01:58:32.453112 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:32.453117 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl" event={"ID":"91a20877-a6ca-4350-a13e-d6cbbba878d2","Type":"ContainerStarted","Data":"95734c002a76189c807b200424a17bd93ba3b573bbb0c24dd0d694f6d2b264b0"}
Apr 21 01:58:32.453645 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:32.453162 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl"
Apr 21 01:58:32.455808 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:32.455786 2568 status_manager.go:895] "Failed to get status for pod" podUID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-59fvl" err="pods \"limitador-operator-controller-manager-85c4996f8c-59fvl\" is forbidden: User \"system:node:ip-10-0-141-35.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-35.ec2.internal' and this object"
Apr 21 01:58:32.493713 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:32.493663 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl" podStartSLOduration=2.493649082 podStartE2EDuration="2.493649082s" podCreationTimestamp="2026-04-21 01:58:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:58:32.492966049 +0000 UTC m=+492.262928215" watchObservedRunningTime="2026-04-21 01:58:32.493649082 +0000 UTC m=+492.263611256"
Apr 21 01:58:32.777408 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:32.777328 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fb0be52-9b4d-46e9-908a-95ffb37cef44" path="/var/lib/kubelet/pods/2fb0be52-9b4d-46e9-908a-95ffb37cef44/volumes"
Apr 21 01:58:43.459196 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:58:43.459166 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-d4cbl"
Apr 21 01:59:15.244821 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:15.244782 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hjbpk"]
Apr 21 01:59:15.248216 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:15.248194 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-hjbpk"
Apr 21 01:59:15.250464 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:15.250448 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-lzqzq\""
Apr 21 01:59:15.254448 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:15.254426 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hjbpk"]
Apr 21 01:59:15.367644 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:15.367605 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bfzv\" (UniqueName: \"kubernetes.io/projected/a66be9dd-b082-44c5-bafc-5a2c85fd9d03-kube-api-access-4bfzv\") pod \"authorino-f99f4b5cd-hjbpk\" (UID: \"a66be9dd-b082-44c5-bafc-5a2c85fd9d03\") " pod="kuadrant-system/authorino-f99f4b5cd-hjbpk"
Apr 21 01:59:15.468871 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:15.468836 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bfzv\" (UniqueName: \"kubernetes.io/projected/a66be9dd-b082-44c5-bafc-5a2c85fd9d03-kube-api-access-4bfzv\") pod \"authorino-f99f4b5cd-hjbpk\" (UID: \"a66be9dd-b082-44c5-bafc-5a2c85fd9d03\") " pod="kuadrant-system/authorino-f99f4b5cd-hjbpk"
Apr 21 01:59:15.476190 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:15.476162 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bfzv\" (UniqueName: \"kubernetes.io/projected/a66be9dd-b082-44c5-bafc-5a2c85fd9d03-kube-api-access-4bfzv\") pod \"authorino-f99f4b5cd-hjbpk\" (UID: \"a66be9dd-b082-44c5-bafc-5a2c85fd9d03\") " pod="kuadrant-system/authorino-f99f4b5cd-hjbpk"
Apr 21 01:59:15.559913 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:15.559845 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-hjbpk"
Apr 21 01:59:15.680581 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:15.680556 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hjbpk"]
Apr 21 01:59:15.683322 ip-10-0-141-35 kubenswrapper[2568]: W0421 01:59:15.683279 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda66be9dd_b082_44c5_bafc_5a2c85fd9d03.slice/crio-ca2835717087a1e523946b72fe6169bdab22589bf5ec87dd83e4aad25f2a92c9 WatchSource:0}: Error finding container ca2835717087a1e523946b72fe6169bdab22589bf5ec87dd83e4aad25f2a92c9: Status 404 returned error can't find the container with id ca2835717087a1e523946b72fe6169bdab22589bf5ec87dd83e4aad25f2a92c9
Apr 21 01:59:16.608009 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:16.607944 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-hjbpk" event={"ID":"a66be9dd-b082-44c5-bafc-5a2c85fd9d03","Type":"ContainerStarted","Data":"ca2835717087a1e523946b72fe6169bdab22589bf5ec87dd83e4aad25f2a92c9"}
Apr 21 01:59:19.623751 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:19.623717 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-hjbpk" event={"ID":"a66be9dd-b082-44c5-bafc-5a2c85fd9d03","Type":"ContainerStarted","Data":"a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3"}
Apr 21 01:59:19.636394 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:19.636347 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-hjbpk" podStartSLOduration=1.486299324 podStartE2EDuration="4.636335522s" podCreationTimestamp="2026-04-21 01:59:15 +0000 UTC" firstStartedPulling="2026-04-21 01:59:15.684602372 +0000 UTC m=+535.454564524" lastFinishedPulling="2026-04-21 01:59:18.834638566 +0000 UTC m=+538.604600722" observedRunningTime="2026-04-21 01:59:19.63628636 +0000 UTC m=+539.406248537" watchObservedRunningTime="2026-04-21 01:59:19.636335522 +0000 UTC m=+539.406297694"
Apr 21 01:59:20.237592 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:20.237559 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hjbpk"]
Apr 21 01:59:21.630710 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:21.630645 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-hjbpk" podUID="a66be9dd-b082-44c5-bafc-5a2c85fd9d03" containerName="authorino" containerID="cri-o://a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3" gracePeriod=30
Apr 21 01:59:21.873499 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:21.873476 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-hjbpk"
Apr 21 01:59:21.928581 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:21.928508 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bfzv\" (UniqueName: \"kubernetes.io/projected/a66be9dd-b082-44c5-bafc-5a2c85fd9d03-kube-api-access-4bfzv\") pod \"a66be9dd-b082-44c5-bafc-5a2c85fd9d03\" (UID: \"a66be9dd-b082-44c5-bafc-5a2c85fd9d03\") "
Apr 21 01:59:21.930535 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:21.930510 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a66be9dd-b082-44c5-bafc-5a2c85fd9d03-kube-api-access-4bfzv" (OuterVolumeSpecName: "kube-api-access-4bfzv") pod "a66be9dd-b082-44c5-bafc-5a2c85fd9d03" (UID: "a66be9dd-b082-44c5-bafc-5a2c85fd9d03"). InnerVolumeSpecName "kube-api-access-4bfzv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 01:59:22.029595 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.029561 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4bfzv\" (UniqueName: \"kubernetes.io/projected/a66be9dd-b082-44c5-bafc-5a2c85fd9d03-kube-api-access-4bfzv\") on node \"ip-10-0-141-35.ec2.internal\" DevicePath \"\""
Apr 21 01:59:22.635495 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.635459 2568 generic.go:358] "Generic (PLEG): container finished" podID="a66be9dd-b082-44c5-bafc-5a2c85fd9d03" containerID="a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3" exitCode=0
Apr 21 01:59:22.635902 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.635507 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-hjbpk"
Apr 21 01:59:22.635902 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.635502 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-hjbpk" event={"ID":"a66be9dd-b082-44c5-bafc-5a2c85fd9d03","Type":"ContainerDied","Data":"a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3"}
Apr 21 01:59:22.635902 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.635618 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-hjbpk" event={"ID":"a66be9dd-b082-44c5-bafc-5a2c85fd9d03","Type":"ContainerDied","Data":"ca2835717087a1e523946b72fe6169bdab22589bf5ec87dd83e4aad25f2a92c9"}
Apr 21 01:59:22.635902 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.635647 2568 scope.go:117] "RemoveContainer" containerID="a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3"
Apr 21 01:59:22.649478 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.649456 2568 scope.go:117] "RemoveContainer" containerID="a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3"
Apr 21 01:59:22.649734 ip-10-0-141-35 kubenswrapper[2568]: E0421 01:59:22.649713 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3\": container with ID starting with a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3 not found: ID does not exist" containerID="a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3"
Apr 21 01:59:22.649800 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.649746 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3"} err="failed to get container status \"a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3\": rpc error: code = NotFound desc = could not find container \"a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3\": container with ID starting with a1b76af7e1c9d82688c3017017e332914303f4af6d87a1f7f1a24aea5f8057e3 not found: ID does not exist"
Apr 21 01:59:22.659927 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.659888 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hjbpk"]
Apr 21 01:59:22.663330 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.663290 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-hjbpk"]
Apr 21 01:59:22.777608 ip-10-0-141-35 kubenswrapper[2568]: I0421 01:59:22.777574 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a66be9dd-b082-44c5-bafc-5a2c85fd9d03" path="/var/lib/kubelet/pods/a66be9dd-b082-44c5-bafc-5a2c85fd9d03/volumes"
Apr 21 02:00:17.155278 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.155242 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 02:00:17.155905 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.155744 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a66be9dd-b082-44c5-bafc-5a2c85fd9d03" containerName="authorino"
Apr 21 02:00:17.155905 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.155763 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a66be9dd-b082-44c5-bafc-5a2c85fd9d03" containerName="authorino"
Apr 21 02:00:17.155905 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.155852 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a66be9dd-b082-44c5-bafc-5a2c85fd9d03" containerName="authorino"
Apr 21 02:00:17.157884 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.157867 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 21 02:00:17.160206 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.160181 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 21 02:00:17.160292 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.160191 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 21 02:00:17.160292 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.160193 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\""
Apr 21 02:00:17.160912 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.160894 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-5hfb4\""
Apr 21 02:00:17.165840 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.165817 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 02:00:17.306663 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.306621 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrn5r\" (UniqueName: \"kubernetes.io/projected/68d002c3-c4b3-435b-8c62-6fac57e1cfe1-kube-api-access-hrn5r\") pod \"maas-keycloak-0\" (UID: \"68d002c3-c4b3-435b-8c62-6fac57e1cfe1\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 02:00:17.407763 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.407669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrn5r\" (UniqueName: \"kubernetes.io/projected/68d002c3-c4b3-435b-8c62-6fac57e1cfe1-kube-api-access-hrn5r\") pod \"maas-keycloak-0\" (UID: \"68d002c3-c4b3-435b-8c62-6fac57e1cfe1\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 02:00:17.415925 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.415887 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrn5r\" (UniqueName: \"kubernetes.io/projected/68d002c3-c4b3-435b-8c62-6fac57e1cfe1-kube-api-access-hrn5r\") pod \"maas-keycloak-0\" (UID: \"68d002c3-c4b3-435b-8c62-6fac57e1cfe1\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 02:00:17.469470 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.469421 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 02:00:17.589927 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.589895 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 02:00:17.593743 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:00:17.593698 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-3d75830ee517ca4a3e3027318ec5285823cc77cab426c0f52bdc30c22b130f9c WatchSource:0}: Error finding container 3d75830ee517ca4a3e3027318ec5285823cc77cab426c0f52bdc30c22b130f9c: Status 404 returned error can't find the container with id 3d75830ee517ca4a3e3027318ec5285823cc77cab426c0f52bdc30c22b130f9c Apr 21 02:00:17.818290 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:17.818206 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"68d002c3-c4b3-435b-8c62-6fac57e1cfe1","Type":"ContainerStarted","Data":"3d75830ee517ca4a3e3027318ec5285823cc77cab426c0f52bdc30c22b130f9c"} Apr 21 02:00:22.171370 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:22.171344 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/1.log" Apr 21 02:00:22.176517 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:22.176491 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/0.log" Apr 21 02:00:22.217061 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:22.217033 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/1.log" Apr 21 02:00:22.222165 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:22.222143 2568 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/0.log" Apr 21 02:00:22.251301 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:22.251271 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 21 02:00:22.839048 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:22.839007 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"68d002c3-c4b3-435b-8c62-6fac57e1cfe1","Type":"ContainerStarted","Data":"59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7"} Apr 21 02:00:22.855351 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:22.855274 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.2011807349999999 podStartE2EDuration="5.855255342s" podCreationTimestamp="2026-04-21 02:00:17 +0000 UTC" firstStartedPulling="2026-04-21 02:00:17.59509742 +0000 UTC m=+597.365059574" lastFinishedPulling="2026-04-21 02:00:22.249172028 +0000 UTC m=+602.019134181" observedRunningTime="2026-04-21 02:00:22.854451802 +0000 UTC m=+602.624413977" watchObservedRunningTime="2026-04-21 02:00:22.855255342 +0000 UTC m=+602.625217517" Apr 21 02:00:23.470368 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:23.470324 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 02:00:23.471886 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:23.471833 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:24.470252 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:24.470195 2568 
prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:25.470791 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:25.470732 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:26.470900 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:26.470849 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:27.469847 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:27.469790 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 21 02:00:27.471122 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:27.471081 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:28.469896 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:28.469850 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection 
refused" Apr 21 02:00:29.470259 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:29.470207 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:30.470705 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:30.470645 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:31.470169 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:31.470115 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:32.470714 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:32.470664 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:33.470896 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:33.470792 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:34.470644 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:34.470598 2568 
prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.34:9000/health/started\": dial tcp 10.134.0.34:9000: connect: connection refused" Apr 21 02:00:35.570778 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:35.570734 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 21 02:00:35.589783 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:35.589724 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 02:00:45.577011 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:45.576975 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 21 02:00:47.022107 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.022075 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-ff448b775-zg8t8"] Apr 21 02:00:47.033358 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.033333 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-ff448b775-zg8t8"] Apr 21 02:00:47.033503 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.033428 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-ff448b775-zg8t8" Apr 21 02:00:47.036481 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.036458 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 21 02:00:47.036607 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.036457 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-lzqzq\"" Apr 21 02:00:47.102761 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.102729 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-tls-cert\") pod \"authorino-ff448b775-zg8t8\" (UID: \"4faa7f65-4e93-44a9-b052-ccfbe84b92ac\") " pod="kuadrant-system/authorino-ff448b775-zg8t8" Apr 21 02:00:47.102912 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.102871 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz9h8\" (UniqueName: \"kubernetes.io/projected/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-kube-api-access-xz9h8\") pod \"authorino-ff448b775-zg8t8\" (UID: \"4faa7f65-4e93-44a9-b052-ccfbe84b92ac\") " pod="kuadrant-system/authorino-ff448b775-zg8t8" Apr 21 02:00:47.204146 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.204112 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9h8\" (UniqueName: \"kubernetes.io/projected/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-kube-api-access-xz9h8\") pod \"authorino-ff448b775-zg8t8\" (UID: \"4faa7f65-4e93-44a9-b052-ccfbe84b92ac\") " pod="kuadrant-system/authorino-ff448b775-zg8t8" Apr 21 02:00:47.204298 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.204163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-tls-cert\") pod \"authorino-ff448b775-zg8t8\" (UID: \"4faa7f65-4e93-44a9-b052-ccfbe84b92ac\") " pod="kuadrant-system/authorino-ff448b775-zg8t8" Apr 21 02:00:47.206532 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.206512 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-tls-cert\") pod \"authorino-ff448b775-zg8t8\" (UID: \"4faa7f65-4e93-44a9-b052-ccfbe84b92ac\") " pod="kuadrant-system/authorino-ff448b775-zg8t8" Apr 21 02:00:47.211229 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.211199 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz9h8\" (UniqueName: \"kubernetes.io/projected/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-kube-api-access-xz9h8\") pod \"authorino-ff448b775-zg8t8\" (UID: \"4faa7f65-4e93-44a9-b052-ccfbe84b92ac\") " pod="kuadrant-system/authorino-ff448b775-zg8t8" Apr 21 02:00:47.343264 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.343221 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-ff448b775-zg8t8" Apr 21 02:00:47.683837 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.673067 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-ff448b775-zg8t8"] Apr 21 02:00:47.683837 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:00:47.675383 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4faa7f65_4e93_44a9_b052_ccfbe84b92ac.slice/crio-91e133b43a17c138175be04b538a09b94c001c52bd66d091354eaaaa420567b2 WatchSource:0}: Error finding container 91e133b43a17c138175be04b538a09b94c001c52bd66d091354eaaaa420567b2: Status 404 returned error can't find the container with id 91e133b43a17c138175be04b538a09b94c001c52bd66d091354eaaaa420567b2 Apr 21 02:00:47.941642 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:47.941561 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-ff448b775-zg8t8" event={"ID":"4faa7f65-4e93-44a9-b052-ccfbe84b92ac","Type":"ContainerStarted","Data":"91e133b43a17c138175be04b538a09b94c001c52bd66d091354eaaaa420567b2"} Apr 21 02:00:48.946351 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:48.946286 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-ff448b775-zg8t8" event={"ID":"4faa7f65-4e93-44a9-b052-ccfbe84b92ac","Type":"ContainerStarted","Data":"b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803"} Apr 21 02:00:48.963288 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:48.963224 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-ff448b775-zg8t8" podStartSLOduration=1.6531076470000001 podStartE2EDuration="1.963207691s" podCreationTimestamp="2026-04-21 02:00:47 +0000 UTC" firstStartedPulling="2026-04-21 02:00:47.676780515 +0000 UTC m=+627.446742668" lastFinishedPulling="2026-04-21 02:00:47.98688056 +0000 UTC m=+627.756842712" 
observedRunningTime="2026-04-21 02:00:48.962754525 +0000 UTC m=+628.732716701" watchObservedRunningTime="2026-04-21 02:00:48.963207691 +0000 UTC m=+628.733169867" Apr 21 02:00:49.963961 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:49.963924 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6c746b54f5-s9jdn"] Apr 21 02:00:49.967378 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:49.967359 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6c746b54f5-s9jdn" Apr 21 02:00:49.969416 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:49.969398 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-s2mck\"" Apr 21 02:00:49.973498 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:49.973476 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6c746b54f5-s9jdn"] Apr 21 02:00:50.032204 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.032170 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rrmr\" (UniqueName: \"kubernetes.io/projected/e3f88db0-2e7a-43ea-8cb1-2cdf024d5868-kube-api-access-7rrmr\") pod \"maas-controller-6c746b54f5-s9jdn\" (UID: \"e3f88db0-2e7a-43ea-8cb1-2cdf024d5868\") " pod="opendatahub/maas-controller-6c746b54f5-s9jdn" Apr 21 02:00:50.086721 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.086689 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6c746b54f5-s9jdn"] Apr 21 02:00:50.086967 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:00:50.086947 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7rrmr], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-6c746b54f5-s9jdn" podUID="e3f88db0-2e7a-43ea-8cb1-2cdf024d5868" Apr 21 02:00:50.110332 ip-10-0-141-35 
kubenswrapper[2568]: I0421 02:00:50.110290 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-577f7c4d6b-tkxnh"] Apr 21 02:00:50.113634 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.113617 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" Apr 21 02:00:50.121159 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.121133 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-577f7c4d6b-tkxnh"] Apr 21 02:00:50.132886 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.132858 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rrmr\" (UniqueName: \"kubernetes.io/projected/e3f88db0-2e7a-43ea-8cb1-2cdf024d5868-kube-api-access-7rrmr\") pod \"maas-controller-6c746b54f5-s9jdn\" (UID: \"e3f88db0-2e7a-43ea-8cb1-2cdf024d5868\") " pod="opendatahub/maas-controller-6c746b54f5-s9jdn" Apr 21 02:00:50.139986 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.139963 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rrmr\" (UniqueName: \"kubernetes.io/projected/e3f88db0-2e7a-43ea-8cb1-2cdf024d5868-kube-api-access-7rrmr\") pod \"maas-controller-6c746b54f5-s9jdn\" (UID: \"e3f88db0-2e7a-43ea-8cb1-2cdf024d5868\") " pod="opendatahub/maas-controller-6c746b54f5-s9jdn" Apr 21 02:00:50.234048 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.233968 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hsxp\" (UniqueName: \"kubernetes.io/projected/af45f2f6-c299-4b8a-99a2-8f531cc9c25f-kube-api-access-4hsxp\") pod \"maas-controller-577f7c4d6b-tkxnh\" (UID: \"af45f2f6-c299-4b8a-99a2-8f531cc9c25f\") " pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" Apr 21 02:00:50.334662 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.334627 2568 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4hsxp\" (UniqueName: \"kubernetes.io/projected/af45f2f6-c299-4b8a-99a2-8f531cc9c25f-kube-api-access-4hsxp\") pod \"maas-controller-577f7c4d6b-tkxnh\" (UID: \"af45f2f6-c299-4b8a-99a2-8f531cc9c25f\") " pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" Apr 21 02:00:50.341725 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.341701 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hsxp\" (UniqueName: \"kubernetes.io/projected/af45f2f6-c299-4b8a-99a2-8f531cc9c25f-kube-api-access-4hsxp\") pod \"maas-controller-577f7c4d6b-tkxnh\" (UID: \"af45f2f6-c299-4b8a-99a2-8f531cc9c25f\") " pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" Apr 21 02:00:50.425642 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.425613 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" Apr 21 02:00:50.545440 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.545410 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-577f7c4d6b-tkxnh"] Apr 21 02:00:50.548750 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:00:50.548721 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf45f2f6_c299_4b8a_99a2_8f531cc9c25f.slice/crio-d4353b27df561fa93b6468b95bec9cdb1864cfe3bf1ba35504c85382014ea406 WatchSource:0}: Error finding container d4353b27df561fa93b6468b95bec9cdb1864cfe3bf1ba35504c85382014ea406: Status 404 returned error can't find the container with id d4353b27df561fa93b6468b95bec9cdb1864cfe3bf1ba35504c85382014ea406 Apr 21 02:00:50.550035 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.550017 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 02:00:50.954354 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.954297 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" event={"ID":"af45f2f6-c299-4b8a-99a2-8f531cc9c25f","Type":"ContainerStarted","Data":"d4353b27df561fa93b6468b95bec9cdb1864cfe3bf1ba35504c85382014ea406"} Apr 21 02:00:50.954354 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.954335 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6c746b54f5-s9jdn" Apr 21 02:00:50.959113 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:50.959090 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6c746b54f5-s9jdn" Apr 21 02:00:51.040453 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:51.040428 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rrmr\" (UniqueName: \"kubernetes.io/projected/e3f88db0-2e7a-43ea-8cb1-2cdf024d5868-kube-api-access-7rrmr\") pod \"e3f88db0-2e7a-43ea-8cb1-2cdf024d5868\" (UID: \"e3f88db0-2e7a-43ea-8cb1-2cdf024d5868\") " Apr 21 02:00:51.042445 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:51.042415 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3f88db0-2e7a-43ea-8cb1-2cdf024d5868-kube-api-access-7rrmr" (OuterVolumeSpecName: "kube-api-access-7rrmr") pod "e3f88db0-2e7a-43ea-8cb1-2cdf024d5868" (UID: "e3f88db0-2e7a-43ea-8cb1-2cdf024d5868"). InnerVolumeSpecName "kube-api-access-7rrmr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:00:51.141764 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:51.141734 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7rrmr\" (UniqueName: \"kubernetes.io/projected/e3f88db0-2e7a-43ea-8cb1-2cdf024d5868-kube-api-access-7rrmr\") on node \"ip-10-0-141-35.ec2.internal\" DevicePath \"\"" Apr 21 02:00:51.958205 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:51.958171 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6c746b54f5-s9jdn" Apr 21 02:00:51.987963 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:51.987933 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6c746b54f5-s9jdn"] Apr 21 02:00:51.991590 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:51.991562 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6c746b54f5-s9jdn"] Apr 21 02:00:52.779934 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:52.779904 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3f88db0-2e7a-43ea-8cb1-2cdf024d5868" path="/var/lib/kubelet/pods/e3f88db0-2e7a-43ea-8cb1-2cdf024d5868/volumes" Apr 21 02:00:53.968218 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:53.968179 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" event={"ID":"af45f2f6-c299-4b8a-99a2-8f531cc9c25f","Type":"ContainerStarted","Data":"39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818"} Apr 21 02:00:53.968712 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:53.968420 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" Apr 21 02:00:53.983805 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:53.983756 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" podStartSLOduration=1.587761461 podStartE2EDuration="3.983743119s" podCreationTimestamp="2026-04-21 02:00:50 +0000 UTC" firstStartedPulling="2026-04-21 02:00:50.550141029 +0000 UTC m=+630.320103182" lastFinishedPulling="2026-04-21 02:00:52.946122685 +0000 UTC m=+632.716084840" observedRunningTime="2026-04-21 02:00:53.981330561 +0000 UTC m=+633.751292730" watchObservedRunningTime="2026-04-21 02:00:53.983743119 +0000 UTC m=+633.753705334" Apr 21 02:00:54.565866 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:54.565831 
2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6cb66fb9d4-m62mr"] Apr 21 02:00:54.569366 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:54.569346 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:00:54.571562 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:54.571545 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 02:00:54.571662 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:54.571547 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 02:00:54.577743 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:54.577717 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6cb66fb9d4-m62mr"] Apr 21 02:00:54.670037 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:54.670002 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-maas-api-tls\") pod \"maas-api-6cb66fb9d4-m62mr\" (UID: \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\") " pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:00:54.670037 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:54.670041 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tplvt\" (UniqueName: \"kubernetes.io/projected/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-kube-api-access-tplvt\") pod \"maas-api-6cb66fb9d4-m62mr\" (UID: \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\") " pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:00:54.770603 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:54.770566 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-maas-api-tls\") 
pod \"maas-api-6cb66fb9d4-m62mr\" (UID: \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\") " pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:00:54.770776 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:54.770610 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tplvt\" (UniqueName: \"kubernetes.io/projected/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-kube-api-access-tplvt\") pod \"maas-api-6cb66fb9d4-m62mr\" (UID: \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\") " pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:00:54.770776 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:00:54.770710 2568 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 21 02:00:54.770776 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:00:54.770775 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-maas-api-tls podName:b35a6a05-9dcb-4d79-afbf-ad313d84c2d7 nodeName:}" failed. No retries permitted until 2026-04-21 02:00:55.270758827 +0000 UTC m=+635.040720979 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-maas-api-tls") pod "maas-api-6cb66fb9d4-m62mr" (UID: "b35a6a05-9dcb-4d79-afbf-ad313d84c2d7") : secret "maas-api-serving-cert" not found Apr 21 02:00:54.779935 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:54.779902 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tplvt\" (UniqueName: \"kubernetes.io/projected/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-kube-api-access-tplvt\") pod \"maas-api-6cb66fb9d4-m62mr\" (UID: \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\") " pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:00:55.274364 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:55.274297 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-maas-api-tls\") pod \"maas-api-6cb66fb9d4-m62mr\" (UID: \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\") " pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:00:55.276764 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:55.276742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-maas-api-tls\") pod \"maas-api-6cb66fb9d4-m62mr\" (UID: \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\") " pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:00:55.480652 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:55.480614 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:00:55.602878 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:55.602763 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6cb66fb9d4-m62mr"] Apr 21 02:00:55.605687 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:00:55.605655 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb35a6a05_9dcb_4d79_afbf_ad313d84c2d7.slice/crio-e96469b63feaa4c36bfb55bebd2b6ff9f3c8a21ffda6ec8db618e2b858eec0de WatchSource:0}: Error finding container e96469b63feaa4c36bfb55bebd2b6ff9f3c8a21ffda6ec8db618e2b858eec0de: Status 404 returned error can't find the container with id e96469b63feaa4c36bfb55bebd2b6ff9f3c8a21ffda6ec8db618e2b858eec0de Apr 21 02:00:55.977179 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:55.977147 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6cb66fb9d4-m62mr" event={"ID":"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7","Type":"ContainerStarted","Data":"e96469b63feaa4c36bfb55bebd2b6ff9f3c8a21ffda6ec8db618e2b858eec0de"} Apr 21 02:00:57.987427 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:57.987384 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6cb66fb9d4-m62mr" event={"ID":"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7","Type":"ContainerStarted","Data":"5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8"} Apr 21 02:00:57.987817 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:57.987445 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:00:58.004494 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:00:58.004443 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6cb66fb9d4-m62mr" podStartSLOduration=2.464180381 podStartE2EDuration="4.004426054s" podCreationTimestamp="2026-04-21 02:00:54 +0000 UTC" 
firstStartedPulling="2026-04-21 02:00:55.606948403 +0000 UTC m=+635.376910557" lastFinishedPulling="2026-04-21 02:00:57.147194072 +0000 UTC m=+636.917156230" observedRunningTime="2026-04-21 02:00:58.002584575 +0000 UTC m=+637.772546750" watchObservedRunningTime="2026-04-21 02:00:58.004426054 +0000 UTC m=+637.774388229" Apr 21 02:01:03.996328 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:03.996271 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6cb66fb9d4-m62mr" Apr 21 02:01:04.978082 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:04.978052 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" Apr 21 02:01:05.305758 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:05.305660 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-c78dcbcb5-gdqr6"] Apr 21 02:01:05.309452 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:05.309430 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-c78dcbcb5-gdqr6" Apr 21 02:01:05.315358 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:05.315331 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-c78dcbcb5-gdqr6"] Apr 21 02:01:05.466518 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:05.466481 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46s79\" (UniqueName: \"kubernetes.io/projected/03e4ec52-d66b-4e41-b9c4-8e26ed7b288b-kube-api-access-46s79\") pod \"maas-controller-c78dcbcb5-gdqr6\" (UID: \"03e4ec52-d66b-4e41-b9c4-8e26ed7b288b\") " pod="opendatahub/maas-controller-c78dcbcb5-gdqr6" Apr 21 02:01:05.567139 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:05.567055 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46s79\" (UniqueName: \"kubernetes.io/projected/03e4ec52-d66b-4e41-b9c4-8e26ed7b288b-kube-api-access-46s79\") pod \"maas-controller-c78dcbcb5-gdqr6\" (UID: \"03e4ec52-d66b-4e41-b9c4-8e26ed7b288b\") " pod="opendatahub/maas-controller-c78dcbcb5-gdqr6" Apr 21 02:01:05.575650 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:05.575623 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46s79\" (UniqueName: \"kubernetes.io/projected/03e4ec52-d66b-4e41-b9c4-8e26ed7b288b-kube-api-access-46s79\") pod \"maas-controller-c78dcbcb5-gdqr6\" (UID: \"03e4ec52-d66b-4e41-b9c4-8e26ed7b288b\") " pod="opendatahub/maas-controller-c78dcbcb5-gdqr6" Apr 21 02:01:05.621349 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:05.621281 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-c78dcbcb5-gdqr6" Apr 21 02:01:05.745054 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:05.745021 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-c78dcbcb5-gdqr6"] Apr 21 02:01:05.755415 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:01:05.755380 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e4ec52_d66b_4e41_b9c4_8e26ed7b288b.slice/crio-691085d9afb93231a03c930e156d58a9ebaafb9060db816d84a25bd7e4c791ef WatchSource:0}: Error finding container 691085d9afb93231a03c930e156d58a9ebaafb9060db816d84a25bd7e4c791ef: Status 404 returned error can't find the container with id 691085d9afb93231a03c930e156d58a9ebaafb9060db816d84a25bd7e4c791ef Apr 21 02:01:06.015680 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:06.015645 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c78dcbcb5-gdqr6" event={"ID":"03e4ec52-d66b-4e41-b9c4-8e26ed7b288b","Type":"ContainerStarted","Data":"691085d9afb93231a03c930e156d58a9ebaafb9060db816d84a25bd7e4c791ef"} Apr 21 02:01:07.020194 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:07.020159 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c78dcbcb5-gdqr6" event={"ID":"03e4ec52-d66b-4e41-b9c4-8e26ed7b288b","Type":"ContainerStarted","Data":"269c85a9e5dbc86c0201fa30a15d60f4211240e814752f2f4cbee48f0337e8f7"} Apr 21 02:01:07.020583 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:07.020303 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-c78dcbcb5-gdqr6" Apr 21 02:01:07.034948 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:07.034905 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-c78dcbcb5-gdqr6" podStartSLOduration=1.7405920849999998 podStartE2EDuration="2.034892421s" 
podCreationTimestamp="2026-04-21 02:01:05 +0000 UTC" firstStartedPulling="2026-04-21 02:01:05.756670125 +0000 UTC m=+645.526632279" lastFinishedPulling="2026-04-21 02:01:06.050970463 +0000 UTC m=+645.820932615" observedRunningTime="2026-04-21 02:01:07.034005207 +0000 UTC m=+646.803967382" watchObservedRunningTime="2026-04-21 02:01:07.034892421 +0000 UTC m=+646.804854596" Apr 21 02:01:18.029278 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:18.029243 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-c78dcbcb5-gdqr6" Apr 21 02:01:18.073183 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:18.073156 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-577f7c4d6b-tkxnh"] Apr 21 02:01:18.073419 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:18.073395 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" podUID="af45f2f6-c299-4b8a-99a2-8f531cc9c25f" containerName="manager" containerID="cri-o://39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818" gracePeriod=10 Apr 21 02:01:18.311676 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:18.311653 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" Apr 21 02:01:18.481123 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:18.481088 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hsxp\" (UniqueName: \"kubernetes.io/projected/af45f2f6-c299-4b8a-99a2-8f531cc9c25f-kube-api-access-4hsxp\") pod \"af45f2f6-c299-4b8a-99a2-8f531cc9c25f\" (UID: \"af45f2f6-c299-4b8a-99a2-8f531cc9c25f\") " Apr 21 02:01:18.483232 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:18.483197 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af45f2f6-c299-4b8a-99a2-8f531cc9c25f-kube-api-access-4hsxp" (OuterVolumeSpecName: "kube-api-access-4hsxp") pod "af45f2f6-c299-4b8a-99a2-8f531cc9c25f" (UID: "af45f2f6-c299-4b8a-99a2-8f531cc9c25f"). InnerVolumeSpecName "kube-api-access-4hsxp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:01:18.582782 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:18.582705 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hsxp\" (UniqueName: \"kubernetes.io/projected/af45f2f6-c299-4b8a-99a2-8f531cc9c25f-kube-api-access-4hsxp\") on node \"ip-10-0-141-35.ec2.internal\" DevicePath \"\"" Apr 21 02:01:19.063074 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.063038 2568 generic.go:358] "Generic (PLEG): container finished" podID="af45f2f6-c299-4b8a-99a2-8f531cc9c25f" containerID="39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818" exitCode=0 Apr 21 02:01:19.063477 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.063106 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" event={"ID":"af45f2f6-c299-4b8a-99a2-8f531cc9c25f","Type":"ContainerDied","Data":"39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818"} Apr 21 02:01:19.063477 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.063125 2568 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" Apr 21 02:01:19.063477 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.063139 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-577f7c4d6b-tkxnh" event={"ID":"af45f2f6-c299-4b8a-99a2-8f531cc9c25f","Type":"ContainerDied","Data":"d4353b27df561fa93b6468b95bec9cdb1864cfe3bf1ba35504c85382014ea406"} Apr 21 02:01:19.063477 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.063158 2568 scope.go:117] "RemoveContainer" containerID="39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818" Apr 21 02:01:19.071547 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.071524 2568 scope.go:117] "RemoveContainer" containerID="39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818" Apr 21 02:01:19.071795 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:19.071774 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818\": container with ID starting with 39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818 not found: ID does not exist" containerID="39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818" Apr 21 02:01:19.071882 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.071802 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818"} err="failed to get container status \"39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818\": rpc error: code = NotFound desc = could not find container \"39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818\": container with ID starting with 39a4e96fa973a0458f73b16ab078da2e84b7d437ba7d27b64975420448b58818 not found: ID does not exist" Apr 21 02:01:19.077916 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.077893 
2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-577f7c4d6b-tkxnh"] Apr 21 02:01:19.081301 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.081276 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-577f7c4d6b-tkxnh"] Apr 21 02:01:19.245919 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.245889 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 02:01:19.246126 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:19.246102 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" containerID="cri-o://59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7" gracePeriod=30 Apr 21 02:01:20.777622 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:20.777595 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af45f2f6-c299-4b8a-99a2-8f531cc9c25f" path="/var/lib/kubelet/pods/af45f2f6-c299-4b8a-99a2-8f531cc9c25f/volumes" Apr 21 02:01:21.075783 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:21.075705 2568 generic.go:358] "Generic (PLEG): container finished" podID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerID="59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7" exitCode=143 Apr 21 02:01:21.075783 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:21.075748 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"68d002c3-c4b3-435b-8c62-6fac57e1cfe1","Type":"ContainerDied","Data":"59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7"} Apr 21 02:01:21.287990 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:21.287965 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:21.308709 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:21.308681 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrn5r\" (UniqueName: \"kubernetes.io/projected/68d002c3-c4b3-435b-8c62-6fac57e1cfe1-kube-api-access-hrn5r\") pod \"68d002c3-c4b3-435b-8c62-6fac57e1cfe1\" (UID: \"68d002c3-c4b3-435b-8c62-6fac57e1cfe1\") " Apr 21 02:01:21.310791 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:21.310765 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d002c3-c4b3-435b-8c62-6fac57e1cfe1-kube-api-access-hrn5r" (OuterVolumeSpecName: "kube-api-access-hrn5r") pod "68d002c3-c4b3-435b-8c62-6fac57e1cfe1" (UID: "68d002c3-c4b3-435b-8c62-6fac57e1cfe1"). InnerVolumeSpecName "kube-api-access-hrn5r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:01:21.409775 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:21.409744 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hrn5r\" (UniqueName: \"kubernetes.io/projected/68d002c3-c4b3-435b-8c62-6fac57e1cfe1-kube-api-access-hrn5r\") on node \"ip-10-0-141-35.ec2.internal\" DevicePath \"\"" Apr 21 02:01:22.080137 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.080103 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:22.080596 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.080107 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"68d002c3-c4b3-435b-8c62-6fac57e1cfe1","Type":"ContainerDied","Data":"3d75830ee517ca4a3e3027318ec5285823cc77cab426c0f52bdc30c22b130f9c"} Apr 21 02:01:22.080596 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.080218 2568 scope.go:117] "RemoveContainer" containerID="59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7" Apr 21 02:01:22.102067 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.102043 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 02:01:22.105510 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.105487 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 02:01:22.125589 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.125565 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 02:01:22.125924 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.125912 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" Apr 21 02:01:22.125961 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.125926 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" Apr 21 02:01:22.125961 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.125938 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af45f2f6-c299-4b8a-99a2-8f531cc9c25f" containerName="manager" Apr 21 02:01:22.125961 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.125944 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="af45f2f6-c299-4b8a-99a2-8f531cc9c25f" containerName="manager" Apr 21 02:01:22.126051 
ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.126001 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" containerName="keycloak" Apr 21 02:01:22.126051 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.126012 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="af45f2f6-c299-4b8a-99a2-8f531cc9c25f" containerName="manager" Apr 21 02:01:22.128465 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.128448 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:22.130661 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.130636 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 21 02:01:22.130661 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.130657 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 02:01:22.130820 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.130664 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-5hfb4\"" Apr 21 02:01:22.130820 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.130670 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\"" Apr 21 02:01:22.130820 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.130674 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 02:01:22.136655 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.136635 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 02:01:22.215562 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.215526 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bckdj\" (UniqueName: \"kubernetes.io/projected/b985d5b6-aed9-427e-a354-f119223e908f-kube-api-access-bckdj\") pod \"maas-keycloak-0\" (UID: \"b985d5b6-aed9-427e-a354-f119223e908f\") " pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:22.215726 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.215575 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/b985d5b6-aed9-427e-a354-f119223e908f-test-realms\") pod \"maas-keycloak-0\" (UID: \"b985d5b6-aed9-427e-a354-f119223e908f\") " pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:22.316864 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.316833 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bckdj\" (UniqueName: \"kubernetes.io/projected/b985d5b6-aed9-427e-a354-f119223e908f-kube-api-access-bckdj\") pod \"maas-keycloak-0\" (UID: \"b985d5b6-aed9-427e-a354-f119223e908f\") " pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:22.317015 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.316877 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/b985d5b6-aed9-427e-a354-f119223e908f-test-realms\") pod \"maas-keycloak-0\" (UID: \"b985d5b6-aed9-427e-a354-f119223e908f\") " pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:22.317564 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.317546 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/b985d5b6-aed9-427e-a354-f119223e908f-test-realms\") pod \"maas-keycloak-0\" (UID: \"b985d5b6-aed9-427e-a354-f119223e908f\") " pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:22.324154 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.324129 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bckdj\" (UniqueName: 
\"kubernetes.io/projected/b985d5b6-aed9-427e-a354-f119223e908f-kube-api-access-bckdj\") pod \"maas-keycloak-0\" (UID: \"b985d5b6-aed9-427e-a354-f119223e908f\") " pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:22.439147 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.439114 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:22.571847 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.571799 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 02:01:22.575735 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:01:22.575703 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb985d5b6_aed9_427e_a354_f119223e908f.slice/crio-466e267bff4806b2ee2f3f2aacd7ec1a2542750677e6eae87f8d9d0866ea60df WatchSource:0}: Error finding container 466e267bff4806b2ee2f3f2aacd7ec1a2542750677e6eae87f8d9d0866ea60df: Status 404 returned error can't find the container with id 466e267bff4806b2ee2f3f2aacd7ec1a2542750677e6eae87f8d9d0866ea60df Apr 21 02:01:22.586534 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:22.586508 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]" Apr 21 02:01:22.777873 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:22.777794 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d002c3-c4b3-435b-8c62-6fac57e1cfe1" path="/var/lib/kubelet/pods/68d002c3-c4b3-435b-8c62-6fac57e1cfe1/volumes" Apr 21 02:01:23.086344 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:23.086297 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" 
event={"ID":"b985d5b6-aed9-427e-a354-f119223e908f","Type":"ContainerStarted","Data":"a6864fb450602afac9a59679644d8d80fafd93335a5881e822e245a18d3ad422"} Apr 21 02:01:23.086741 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:23.086351 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"b985d5b6-aed9-427e-a354-f119223e908f","Type":"ContainerStarted","Data":"466e267bff4806b2ee2f3f2aacd7ec1a2542750677e6eae87f8d9d0866ea60df"} Apr 21 02:01:23.104013 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:23.103969 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=0.705617717 podStartE2EDuration="1.103954329s" podCreationTimestamp="2026-04-21 02:01:22 +0000 UTC" firstStartedPulling="2026-04-21 02:01:22.577608644 +0000 UTC m=+662.347570809" lastFinishedPulling="2026-04-21 02:01:22.975945265 +0000 UTC m=+662.745907421" observedRunningTime="2026-04-21 02:01:23.103244672 +0000 UTC m=+662.873206848" watchObservedRunningTime="2026-04-21 02:01:23.103954329 +0000 UTC m=+662.873916503" Apr 21 02:01:23.440471 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:23.440374 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:23.442243 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:23.442206 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused" Apr 21 02:01:24.440246 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:24.440197 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get 
\"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused" Apr 21 02:01:25.263461 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:25.263416 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]" Apr 21 02:01:25.440316 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:25.440264 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused" Apr 21 02:01:26.440365 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:26.440286 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused" Apr 21 02:01:27.440424 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:27.440376 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused" Apr 21 02:01:28.439742 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:28.439689 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 
10.134.0.40:9000: connect: connection refused" Apr 21 02:01:29.440073 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:29.440023 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused" Apr 21 02:01:30.439759 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:30.439703 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused" Apr 21 02:01:31.440579 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:31.440532 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused" Apr 21 02:01:32.440117 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:32.440076 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 21 02:01:32.440453 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:32.440410 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused" Apr 21 02:01:32.644768 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:32.644731 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 02:01:33.439742 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:33.439681 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused"
Apr 21 02:01:34.440079 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:34.440032 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused"
Apr 21 02:01:34.792685 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:34.792598 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6cb66fb9d4-m62mr"]
Apr 21 02:01:34.794643 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:34.794603 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-6cb66fb9d4-m62mr" podUID="b35a6a05-9dcb-4d79-afbf-ad313d84c2d7" containerName="maas-api" containerID="cri-o://5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8" gracePeriod=30
Apr 21 02:01:35.067529 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.067503 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6cb66fb9d4-m62mr"
Apr 21 02:01:35.142038 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.141980 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-maas-api-tls\") pod \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\" (UID: \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\") "
Apr 21 02:01:35.142231 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.142088 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tplvt\" (UniqueName: \"kubernetes.io/projected/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-kube-api-access-tplvt\") pod \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\" (UID: \"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7\") "
Apr 21 02:01:35.143159 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.142857 2568 generic.go:358] "Generic (PLEG): container finished" podID="b35a6a05-9dcb-4d79-afbf-ad313d84c2d7" containerID="5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8" exitCode=0
Apr 21 02:01:35.143159 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.142929 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6cb66fb9d4-m62mr" event={"ID":"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7","Type":"ContainerDied","Data":"5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8"}
Apr 21 02:01:35.143159 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.142935 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6cb66fb9d4-m62mr"
Apr 21 02:01:35.143159 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.142961 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6cb66fb9d4-m62mr" event={"ID":"b35a6a05-9dcb-4d79-afbf-ad313d84c2d7","Type":"ContainerDied","Data":"e96469b63feaa4c36bfb55bebd2b6ff9f3c8a21ffda6ec8db618e2b858eec0de"}
Apr 21 02:01:35.143159 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.142984 2568 scope.go:117] "RemoveContainer" containerID="5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8"
Apr 21 02:01:35.144839 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.144809 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-kube-api-access-tplvt" (OuterVolumeSpecName: "kube-api-access-tplvt") pod "b35a6a05-9dcb-4d79-afbf-ad313d84c2d7" (UID: "b35a6a05-9dcb-4d79-afbf-ad313d84c2d7"). InnerVolumeSpecName "kube-api-access-tplvt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:01:35.144946 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.144887 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "b35a6a05-9dcb-4d79-afbf-ad313d84c2d7" (UID: "b35a6a05-9dcb-4d79-afbf-ad313d84c2d7"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 02:01:35.171446 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.171415 2568 scope.go:117] "RemoveContainer" containerID="5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8"
Apr 21 02:01:35.171778 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:35.171748 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8\": container with ID starting with 5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8 not found: ID does not exist" containerID="5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8"
Apr 21 02:01:35.171867 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.171785 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8"} err="failed to get container status \"5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8\": rpc error: code = NotFound desc = could not find container \"5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8\": container with ID starting with 5b2213041daebd54e1342f380037147aee630e75bdb0286338774be234b3e5f8 not found: ID does not exist"
Apr 21 02:01:35.243984 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.243948 2568 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-maas-api-tls\") on node \"ip-10-0-141-35.ec2.internal\" DevicePath \"\""
Apr 21 02:01:35.243984 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.243975 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tplvt\" (UniqueName: \"kubernetes.io/projected/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7-kube-api-access-tplvt\") on node \"ip-10-0-141-35.ec2.internal\" DevicePath \"\""
Apr 21 02:01:35.440594 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.440548 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.40:9000/health/started\": dial tcp 10.134.0.40:9000: connect: connection refused"
Apr 21 02:01:35.464237 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.464205 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6cb66fb9d4-m62mr"]
Apr 21 02:01:35.467375 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:35.467339 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-6cb66fb9d4-m62mr"]
Apr 21 02:01:36.557878 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:36.557826 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0"
Apr 21 02:01:36.573577 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:36.573078 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="b985d5b6-aed9-427e-a354-f119223e908f" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 02:01:36.780647 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:36.780602 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b35a6a05-9dcb-4d79-afbf-ad313d84c2d7" path="/var/lib/kubelet/pods/b35a6a05-9dcb-4d79-afbf-ad313d84c2d7/volumes"
Apr 21 02:01:37.403790 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:37.403746 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 02:01:37.403790 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:37.403767 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 02:01:40.301713 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:40.301659 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 02:01:42.656824 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:42.656781 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 02:01:46.564950 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:46.564919 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0"
Apr 21 02:01:52.704503 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:52.704465 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 02:01:55.249230 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:55.249193 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 02:01:58.897993 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:58.897960 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-ff448b775-zg8t8"]
Apr 21 02:01:58.898620 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:58.898227 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-ff448b775-zg8t8" podUID="4faa7f65-4e93-44a9-b052-ccfbe84b92ac" containerName="authorino" containerID="cri-o://b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803" gracePeriod=30
Apr 21 02:01:59.149103 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.149046 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-ff448b775-zg8t8"
Apr 21 02:01:59.238368 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.238326 2568 generic.go:358] "Generic (PLEG): container finished" podID="4faa7f65-4e93-44a9-b052-ccfbe84b92ac" containerID="b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803" exitCode=0
Apr 21 02:01:59.238532 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.238367 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-ff448b775-zg8t8" event={"ID":"4faa7f65-4e93-44a9-b052-ccfbe84b92ac","Type":"ContainerDied","Data":"b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803"}
Apr 21 02:01:59.238532 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.238389 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-ff448b775-zg8t8"
Apr 21 02:01:59.238532 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.238412 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-ff448b775-zg8t8" event={"ID":"4faa7f65-4e93-44a9-b052-ccfbe84b92ac","Type":"ContainerDied","Data":"91e133b43a17c138175be04b538a09b94c001c52bd66d091354eaaaa420567b2"}
Apr 21 02:01:59.238532 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.238428 2568 scope.go:117] "RemoveContainer" containerID="b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803"
Apr 21 02:01:59.247049 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.247021 2568 scope.go:117] "RemoveContainer" containerID="b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803"
Apr 21 02:01:59.247296 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:01:59.247277 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803\": container with ID starting with b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803 not found: ID does not exist" containerID="b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803"
Apr 21 02:01:59.247382 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.247330 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803"} err="failed to get container status \"b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803\": rpc error: code = NotFound desc = could not find container \"b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803\": container with ID starting with b9c9fb929a0ef19016f96f9ab30d2a20d6ba9906574ca3c74b235e0195d23803 not found: ID does not exist"
Apr 21 02:01:59.274640 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.274608 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz9h8\" (UniqueName: \"kubernetes.io/projected/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-kube-api-access-xz9h8\") pod \"4faa7f65-4e93-44a9-b052-ccfbe84b92ac\" (UID: \"4faa7f65-4e93-44a9-b052-ccfbe84b92ac\") "
Apr 21 02:01:59.274777 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.274650 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-tls-cert\") pod \"4faa7f65-4e93-44a9-b052-ccfbe84b92ac\" (UID: \"4faa7f65-4e93-44a9-b052-ccfbe84b92ac\") "
Apr 21 02:01:59.276776 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.276745 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-kube-api-access-xz9h8" (OuterVolumeSpecName: "kube-api-access-xz9h8") pod "4faa7f65-4e93-44a9-b052-ccfbe84b92ac" (UID: "4faa7f65-4e93-44a9-b052-ccfbe84b92ac"). InnerVolumeSpecName "kube-api-access-xz9h8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 02:01:59.285090 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.285066 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "4faa7f65-4e93-44a9-b052-ccfbe84b92ac" (UID: "4faa7f65-4e93-44a9-b052-ccfbe84b92ac"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 02:01:59.375912 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.375880 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xz9h8\" (UniqueName: \"kubernetes.io/projected/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-kube-api-access-xz9h8\") on node \"ip-10-0-141-35.ec2.internal\" DevicePath \"\""
Apr 21 02:01:59.375912 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.375906 2568 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/4faa7f65-4e93-44a9-b052-ccfbe84b92ac-tls-cert\") on node \"ip-10-0-141-35.ec2.internal\" DevicePath \"\""
Apr 21 02:01:59.559036 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.559003 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-ff448b775-zg8t8"]
Apr 21 02:01:59.562380 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:01:59.562358 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-ff448b775-zg8t8"]
Apr 21 02:02:00.783347 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:00.783290 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4faa7f65-4e93-44a9-b052-ccfbe84b92ac" path="/var/lib/kubelet/pods/4faa7f65-4e93-44a9-b052-ccfbe84b92ac/volumes"
Apr 21 02:02:02.752251 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:02:02.752216 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 02:02:07.363233 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.363034 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"]
Apr 21 02:02:07.363909 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.363878 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b35a6a05-9dcb-4d79-afbf-ad313d84c2d7" containerName="maas-api"
Apr 21 02:02:07.363909 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.363905 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35a6a05-9dcb-4d79-afbf-ad313d84c2d7" containerName="maas-api"
Apr 21 02:02:07.364049 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.363959 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4faa7f65-4e93-44a9-b052-ccfbe84b92ac" containerName="authorino"
Apr 21 02:02:07.364049 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.363968 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4faa7f65-4e93-44a9-b052-ccfbe84b92ac" containerName="authorino"
Apr 21 02:02:07.364183 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.364169 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b35a6a05-9dcb-4d79-afbf-ad313d84c2d7" containerName="maas-api"
Apr 21 02:02:07.364246 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.364198 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4faa7f65-4e93-44a9-b052-ccfbe84b92ac" containerName="authorino"
Apr 21 02:02:07.367258 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.367236 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.370912 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.370889 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 21 02:02:07.371627 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.371599 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"]
Apr 21 02:02:07.371747 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.371642 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-rcr7n\""
Apr 21 02:02:07.371747 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.371650 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 21 02:02:07.371747 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.371649 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 21 02:02:07.437723 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.437681 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.437881 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.437738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.437881 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.437820 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.437881 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.437852 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qznnw\" (UniqueName: \"kubernetes.io/projected/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-kube-api-access-qznnw\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.437881 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.437873 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.438016 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.437903 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.538895 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.538858 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.539084 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.538905 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.539084 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.539032 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.539084 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.539072 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qznnw\" (UniqueName: \"kubernetes.io/projected/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-kube-api-access-qznnw\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.539247 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.539105 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.539247 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.539152 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.539389 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.539364 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.539443 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.539391 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.539443 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.539430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.541424 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.541395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.541627 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.541610 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.546401 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.546379 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qznnw\" (UniqueName: \"kubernetes.io/projected/48e1075e-4fa4-45ff-99c6-6f8bb616c7b9-kube-api-access-qznnw\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh\" (UID: \"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.679495 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.679409 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"
Apr 21 02:02:07.803698 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:07.803674 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh"]
Apr 21 02:02:07.806503 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:02:07.806450 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48e1075e_4fa4_45ff_99c6_6f8bb616c7b9.slice/crio-20f3b56acb8e1bc9eed7b37c6db2ae78345151d67dfc97258f4dc2657cc6c656 WatchSource:0}: Error finding container 20f3b56acb8e1bc9eed7b37c6db2ae78345151d67dfc97258f4dc2657cc6c656: Status 404 returned error can't find the container with id 20f3b56acb8e1bc9eed7b37c6db2ae78345151d67dfc97258f4dc2657cc6c656
Apr 21 02:02:08.270585 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:08.270552 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh" event={"ID":"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9","Type":"ContainerStarted","Data":"20f3b56acb8e1bc9eed7b37c6db2ae78345151d67dfc97258f4dc2657cc6c656"}
Apr 21 02:02:10.307828 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:02:10.307788 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 02:02:12.762591 ip-10-0-141-35 kubenswrapper[2568]: E0421 02:02:12.762554 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d002c3_c4b3_435b_8c62_6fac57e1cfe1.slice/crio-59977a7e5237c632a0efcd642b7c0c23059ecbfb0d275a4f4aa48923ca8b4ff7.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 02:02:15.295788 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:15.295742 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh" event={"ID":"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9","Type":"ContainerStarted","Data":"c2f09db67354220d3c200f43d4a6ac0d58212dce410aad154e7e58710bca4c37"}
Apr 21 02:02:16.446131 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.446094 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"]
Apr 21 02:02:16.450082 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.450058 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.452387 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.452364 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 21 02:02:16.461054 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.461006 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"]
Apr 21 02:02:16.527105 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.527065 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.527264 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.527121 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.527264 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.527185 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.527409 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.527292 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.527467 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.527402 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/657fe588-2dc9-401b-812e-6fee3fa8133a-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.527517 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.527473 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cllds\" (UniqueName: \"kubernetes.io/projected/657fe588-2dc9-401b-812e-6fee3fa8133a-kube-api-access-cllds\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.628917 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.628881 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.629092 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.628934 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.629092 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.628971 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/657fe588-2dc9-401b-812e-6fee3fa8133a-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.629092 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.629024 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cllds\" (UniqueName: \"kubernetes.io/projected/657fe588-2dc9-401b-812e-6fee3fa8133a-kube-api-access-cllds\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.629368 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.629346 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.629451 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.629382 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.629451 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.629434 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.629708 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.629689 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.629787 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.629738 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.631802 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.631778 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/657fe588-2dc9-401b-812e-6fee3fa8133a-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.632037 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.632019 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/657fe588-2dc9-401b-812e-6fee3fa8133a-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.635929 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.635906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cllds\" (UniqueName: \"kubernetes.io/projected/657fe588-2dc9-401b-812e-6fee3fa8133a-kube-api-access-cllds\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2\" (UID: \"657fe588-2dc9-401b-812e-6fee3fa8133a\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"
Apr 21 02:02:16.766025 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.765938 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2" Apr 21 02:02:16.896604 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:16.896579 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2"] Apr 21 02:02:16.899163 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:02:16.899127 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod657fe588_2dc9_401b_812e_6fee3fa8133a.slice/crio-130e6b262a62b32b4a9e062b5d8ea23d6654396476176c86fd544cd4b38bcbab WatchSource:0}: Error finding container 130e6b262a62b32b4a9e062b5d8ea23d6654396476176c86fd544cd4b38bcbab: Status 404 returned error can't find the container with id 130e6b262a62b32b4a9e062b5d8ea23d6654396476176c86fd544cd4b38bcbab Apr 21 02:02:17.304363 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:17.304329 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2" event={"ID":"657fe588-2dc9-401b-812e-6fee3fa8133a","Type":"ContainerStarted","Data":"fa2ccb0bf158fc4868158a7ffd3150903f94876ce5cfd4d5fed468d2414f6bed"} Apr 21 02:02:17.304363 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:17.304369 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2" event={"ID":"657fe588-2dc9-401b-812e-6fee3fa8133a","Type":"ContainerStarted","Data":"130e6b262a62b32b4a9e062b5d8ea23d6654396476176c86fd544cd4b38bcbab"} Apr 21 02:02:20.317447 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:20.317409 2568 generic.go:358] "Generic (PLEG): container finished" podID="48e1075e-4fa4-45ff-99c6-6f8bb616c7b9" containerID="c2f09db67354220d3c200f43d4a6ac0d58212dce410aad154e7e58710bca4c37" exitCode=0 Apr 21 02:02:20.317843 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:20.317477 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh" event={"ID":"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9","Type":"ContainerDied","Data":"c2f09db67354220d3c200f43d4a6ac0d58212dce410aad154e7e58710bca4c37"} Apr 21 02:02:21.158869 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.158798 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq"] Apr 21 02:02:21.163201 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.163176 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.165429 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.165404 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 21 02:02:21.170514 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.170488 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq"] Apr 21 02:02:21.272937 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.272897 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.273118 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.273015 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.273118 ip-10-0-141-35 
kubenswrapper[2568]: I0421 02:02:21.273078 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr4dn\" (UniqueName: \"kubernetes.io/projected/c42c4647-9994-4c36-8b7e-592e82fab77c-kube-api-access-sr4dn\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.273234 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.273131 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.273234 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.273155 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c42c4647-9994-4c36-8b7e-592e82fab77c-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.273350 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.273226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.374524 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.374483 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"home\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.374995 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.374562 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr4dn\" (UniqueName: \"kubernetes.io/projected/c42c4647-9994-4c36-8b7e-592e82fab77c-kube-api-access-sr4dn\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.374995 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.374612 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.374995 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.374642 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c42c4647-9994-4c36-8b7e-592e82fab77c-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.374995 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.374688 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.374995 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.374723 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.375260 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.375233 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.375481 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.375457 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.375481 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.375470 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.377814 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.377792 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c42c4647-9994-4c36-8b7e-592e82fab77c-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.378001 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.377982 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c42c4647-9994-4c36-8b7e-592e82fab77c-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.382821 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.382778 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr4dn\" (UniqueName: \"kubernetes.io/projected/c42c4647-9994-4c36-8b7e-592e82fab77c-kube-api-access-sr4dn\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq\" (UID: \"c42c4647-9994-4c36-8b7e-592e82fab77c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.477940 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.477916 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:21.622717 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:21.622689 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq"] Apr 21 02:02:21.624469 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:02:21.624435 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42c4647_9994_4c36_8b7e_592e82fab77c.slice/crio-03403db7d7692be854b503c840f43ad14e2737614e6c9d82676901c2ebf13930 WatchSource:0}: Error finding container 03403db7d7692be854b503c840f43ad14e2737614e6c9d82676901c2ebf13930: Status 404 returned error can't find the container with id 03403db7d7692be854b503c840f43ad14e2737614e6c9d82676901c2ebf13930 Apr 21 02:02:22.327075 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:22.327026 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" event={"ID":"c42c4647-9994-4c36-8b7e-592e82fab77c","Type":"ContainerStarted","Data":"328c174f649c1b22ff0572f8bc6681a287df3236fba5a40e029c400fb11d7c12"} Apr 21 02:02:22.327278 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:22.327087 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" event={"ID":"c42c4647-9994-4c36-8b7e-592e82fab77c","Type":"ContainerStarted","Data":"03403db7d7692be854b503c840f43ad14e2737614e6c9d82676901c2ebf13930"} Apr 21 02:02:22.328842 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:22.328816 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh" event={"ID":"48e1075e-4fa4-45ff-99c6-6f8bb616c7b9","Type":"ContainerStarted","Data":"caf72c28d754d23a466e571726e9fae488ff369bc0886a3a5b0d2fd6a7e6ba48"} Apr 21 02:02:22.329038 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:22.329019 2568 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh" Apr 21 02:02:22.359359 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:22.359295 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh" podStartSLOduration=1.702240406 podStartE2EDuration="15.359276022s" podCreationTimestamp="2026-04-21 02:02:07 +0000 UTC" firstStartedPulling="2026-04-21 02:02:07.808690822 +0000 UTC m=+707.578652975" lastFinishedPulling="2026-04-21 02:02:21.465726439 +0000 UTC m=+721.235688591" observedRunningTime="2026-04-21 02:02:22.357947246 +0000 UTC m=+722.127909422" watchObservedRunningTime="2026-04-21 02:02:22.359276022 +0000 UTC m=+722.129238234" Apr 21 02:02:23.333700 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:23.333669 2568 generic.go:358] "Generic (PLEG): container finished" podID="657fe588-2dc9-401b-812e-6fee3fa8133a" containerID="fa2ccb0bf158fc4868158a7ffd3150903f94876ce5cfd4d5fed468d2414f6bed" exitCode=0 Apr 21 02:02:23.334078 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:23.333745 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2" event={"ID":"657fe588-2dc9-401b-812e-6fee3fa8133a","Type":"ContainerDied","Data":"fa2ccb0bf158fc4868158a7ffd3150903f94876ce5cfd4d5fed468d2414f6bed"} Apr 21 02:02:24.338297 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:24.338264 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2" event={"ID":"657fe588-2dc9-401b-812e-6fee3fa8133a","Type":"ContainerStarted","Data":"9b272a1de3d6611d25c63f65d006b6d52cf81ea8aa57ba84047746ba1fed5a28"} Apr 21 02:02:24.338666 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:24.338518 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2" 
Apr 21 02:02:24.356368 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:24.356299 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2" podStartSLOduration=8.176980723 podStartE2EDuration="8.356287966s" podCreationTimestamp="2026-04-21 02:02:16 +0000 UTC" firstStartedPulling="2026-04-21 02:02:23.33459867 +0000 UTC m=+723.104560823" lastFinishedPulling="2026-04-21 02:02:23.513905901 +0000 UTC m=+723.283868066" observedRunningTime="2026-04-21 02:02:24.354336735 +0000 UTC m=+724.124298907" watchObservedRunningTime="2026-04-21 02:02:24.356287966 +0000 UTC m=+724.126250141" Apr 21 02:02:27.350966 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:27.350931 2568 generic.go:358] "Generic (PLEG): container finished" podID="c42c4647-9994-4c36-8b7e-592e82fab77c" containerID="328c174f649c1b22ff0572f8bc6681a287df3236fba5a40e029c400fb11d7c12" exitCode=0 Apr 21 02:02:27.351340 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:27.351007 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" event={"ID":"c42c4647-9994-4c36-8b7e-592e82fab77c","Type":"ContainerDied","Data":"328c174f649c1b22ff0572f8bc6681a287df3236fba5a40e029c400fb11d7c12"} Apr 21 02:02:28.356888 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:28.356857 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" event={"ID":"c42c4647-9994-4c36-8b7e-592e82fab77c","Type":"ContainerStarted","Data":"ba77805000b33f1800d68803c4a25f5f1b77ef1d23acfcacb4f76dd39a2fa3de"} Apr 21 02:02:28.357365 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:28.357069 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:28.374228 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:28.374183 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" podStartSLOduration=7.199596332 podStartE2EDuration="7.374170441s" podCreationTimestamp="2026-04-21 02:02:21 +0000 UTC" firstStartedPulling="2026-04-21 02:02:27.351705136 +0000 UTC m=+727.121667289" lastFinishedPulling="2026-04-21 02:02:27.526279242 +0000 UTC m=+727.296241398" observedRunningTime="2026-04-21 02:02:28.371730581 +0000 UTC m=+728.141692756" watchObservedRunningTime="2026-04-21 02:02:28.374170441 +0000 UTC m=+728.144132615" Apr 21 02:02:33.348106 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.348071 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh" Apr 21 02:02:33.350497 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.350478 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2"] Apr 21 02:02:33.374710 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.374682 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2"] Apr 21 02:02:33.374874 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.374818 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.376990 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.376966 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 21 02:02:33.484153 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.484119 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.484336 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.484170 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.484336 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.484258 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkf6m\" (UniqueName: \"kubernetes.io/projected/6fe11027-c13b-4c8b-b770-2c4938867237-kube-api-access-gkf6m\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.484465 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.484385 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-dshm\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.484465 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.484439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe11027-c13b-4c8b-b770-2c4938867237-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.484558 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.484471 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.585663 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.585634 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkf6m\" (UniqueName: \"kubernetes.io/projected/6fe11027-c13b-4c8b-b770-2c4938867237-kube-api-access-gkf6m\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.585835 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.585675 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.585835 ip-10-0-141-35 kubenswrapper[2568]: I0421 
02:02:33.585780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe11027-c13b-4c8b-b770-2c4938867237-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.585835 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.585811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.585983 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.585913 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.585983 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.585970 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.586274 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.586247 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: 
\"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.586387 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.586278 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.586387 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.586340 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.588027 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.588009 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6fe11027-c13b-4c8b-b770-2c4938867237-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.588411 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.588391 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe11027-c13b-4c8b-b770-2c4938867237-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.592886 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.592858 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkf6m\" (UniqueName: 
\"kubernetes.io/projected/6fe11027-c13b-4c8b-b770-2c4938867237-kube-api-access-gkf6m\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-cqzv2\" (UID: \"6fe11027-c13b-4c8b-b770-2c4938867237\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.685422 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.685341 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:33.814024 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:33.813962 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2"] Apr 21 02:02:33.816580 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:02:33.816553 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fe11027_c13b_4c8b_b770_2c4938867237.slice/crio-f0682dc671122848b483889b8c700221245af0ed33f9555cbd3aa9932c0e3cee WatchSource:0}: Error finding container f0682dc671122848b483889b8c700221245af0ed33f9555cbd3aa9932c0e3cee: Status 404 returned error can't find the container with id f0682dc671122848b483889b8c700221245af0ed33f9555cbd3aa9932c0e3cee Apr 21 02:02:34.379483 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.379444 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" event={"ID":"6fe11027-c13b-4c8b-b770-2c4938867237","Type":"ContainerStarted","Data":"827da68749db90f7d22a74d5fd7c7b364bd683fa6dc937c24498e8f0a6bcceb6"} Apr 21 02:02:34.379483 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.379490 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" event={"ID":"6fe11027-c13b-4c8b-b770-2c4938867237","Type":"ContainerStarted","Data":"f0682dc671122848b483889b8c700221245af0ed33f9555cbd3aa9932c0e3cee"} Apr 21 02:02:34.749121 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.749032 2568 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p"] Apr 21 02:02:34.751705 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.751683 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:34.753776 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.753754 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 21 02:02:34.761145 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.761123 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p"] Apr 21 02:02:34.898646 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.898605 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:34.898906 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.898779 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j7mm\" (UniqueName: \"kubernetes.io/projected/4e05eab3-c898-415d-b93a-e76a3651ad05-kube-api-access-9j7mm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:34.898906 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.898812 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:34.898906 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.898838 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e05eab3-c898-415d-b93a-e76a3651ad05-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:34.898906 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.898868 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:34.899141 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:34.898966 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.000088 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.000003 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-home\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.000251 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.000092 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j7mm\" (UniqueName: \"kubernetes.io/projected/4e05eab3-c898-415d-b93a-e76a3651ad05-kube-api-access-9j7mm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.000251 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.000117 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.000251 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.000136 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e05eab3-c898-415d-b93a-e76a3651ad05-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.000251 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.000161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.000251 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.000198 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.000558 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.000463 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.000558 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.000519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.000558 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.000543 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.002412 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.002388 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e05eab3-c898-415d-b93a-e76a3651ad05-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.002740 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.002724 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e05eab3-c898-415d-b93a-e76a3651ad05-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.007712 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.007689 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j7mm\" (UniqueName: \"kubernetes.io/projected/4e05eab3-c898-415d-b93a-e76a3651ad05-kube-api-access-9j7mm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-rh27p\" (UID: \"4e05eab3-c898-415d-b93a-e76a3651ad05\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.062717 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.062689 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:35.228987 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.228956 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p"] Apr 21 02:02:35.232227 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:02:35.232198 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e05eab3_c898_415d_b93a_e76a3651ad05.slice/crio-9c2398426c68fc39bb4c5e54faa52686cd79a818a72a7864780b5f77af104793 WatchSource:0}: Error finding container 9c2398426c68fc39bb4c5e54faa52686cd79a818a72a7864780b5f77af104793: Status 404 returned error can't find the container with id 9c2398426c68fc39bb4c5e54faa52686cd79a818a72a7864780b5f77af104793 Apr 21 02:02:35.360621 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.360590 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2" Apr 21 02:02:35.385715 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.385631 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" event={"ID":"4e05eab3-c898-415d-b93a-e76a3651ad05","Type":"ContainerStarted","Data":"eaf5876d4c2d8984553eee305ef332d70681857066a178d127e7ab63908b6c1c"} Apr 21 02:02:35.385715 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:35.385684 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" event={"ID":"4e05eab3-c898-415d-b93a-e76a3651ad05","Type":"ContainerStarted","Data":"9c2398426c68fc39bb4c5e54faa52686cd79a818a72a7864780b5f77af104793"} Apr 21 02:02:39.374913 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:39.374886 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq" Apr 21 02:02:39.403737 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:39.403692 2568 generic.go:358] "Generic (PLEG): container finished" podID="6fe11027-c13b-4c8b-b770-2c4938867237" containerID="827da68749db90f7d22a74d5fd7c7b364bd683fa6dc937c24498e8f0a6bcceb6" exitCode=0 Apr 21 02:02:39.403958 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:39.403767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" event={"ID":"6fe11027-c13b-4c8b-b770-2c4938867237","Type":"ContainerDied","Data":"827da68749db90f7d22a74d5fd7c7b364bd683fa6dc937c24498e8f0a6bcceb6"} Apr 21 02:02:40.409290 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:40.409260 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" event={"ID":"6fe11027-c13b-4c8b-b770-2c4938867237","Type":"ContainerStarted","Data":"4aff41d5ac697ee25eefdb494d0eeb9caefecb3eb30001786945807ee8ba84be"} Apr 21 02:02:40.409806 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:40.409493 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:40.428260 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:40.428213 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" podStartSLOduration=7.263752147 podStartE2EDuration="7.428198207s" podCreationTimestamp="2026-04-21 02:02:33 +0000 UTC" firstStartedPulling="2026-04-21 02:02:39.404459056 +0000 UTC m=+739.174421209" lastFinishedPulling="2026-04-21 02:02:39.568905115 +0000 UTC m=+739.338867269" observedRunningTime="2026-04-21 02:02:40.425404598 +0000 UTC m=+740.195366772" watchObservedRunningTime="2026-04-21 02:02:40.428198207 +0000 UTC m=+740.198160444" Apr 21 02:02:44.424695 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:44.424659 2568 generic.go:358] 
"Generic (PLEG): container finished" podID="4e05eab3-c898-415d-b93a-e76a3651ad05" containerID="eaf5876d4c2d8984553eee305ef332d70681857066a178d127e7ab63908b6c1c" exitCode=0 Apr 21 02:02:44.425161 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:44.424738 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" event={"ID":"4e05eab3-c898-415d-b93a-e76a3651ad05","Type":"ContainerDied","Data":"eaf5876d4c2d8984553eee305ef332d70681857066a178d127e7ab63908b6c1c"} Apr 21 02:02:45.432356 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:45.432289 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" event={"ID":"4e05eab3-c898-415d-b93a-e76a3651ad05","Type":"ContainerStarted","Data":"30b2006e771b54e2068a660d44fda8b3c666e4692dca27fc1d1a4cda67a30948"} Apr 21 02:02:45.432760 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:45.432522 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:45.454811 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:45.454709 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" podStartSLOduration=11.290096675000001 podStartE2EDuration="11.454682075s" podCreationTimestamp="2026-04-21 02:02:34 +0000 UTC" firstStartedPulling="2026-04-21 02:02:44.425513936 +0000 UTC m=+744.195476088" lastFinishedPulling="2026-04-21 02:02:44.590099331 +0000 UTC m=+744.360061488" observedRunningTime="2026-04-21 02:02:45.449794519 +0000 UTC m=+745.219756686" watchObservedRunningTime="2026-04-21 02:02:45.454682075 +0000 UTC m=+745.224644250" Apr 21 02:02:49.355884 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.355846 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z"] Apr 21 
02:02:49.358920 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.358897 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.361060 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.361037 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 21 02:02:49.367191 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.366883 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z"] Apr 21 02:02:49.442287 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.442260 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72e02ddf-2fca-414a-b5b5-2900567c181e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.442467 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.442397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.442467 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.442430 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 
02:02:49.442548 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.442483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.442548 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.442533 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.442622 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.442554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svkrj\" (UniqueName: \"kubernetes.io/projected/72e02ddf-2fca-414a-b5b5-2900567c181e-kube-api-access-svkrj\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.543485 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.543441 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.543485 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.543484 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.543722 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.543515 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.543722 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.543558 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.543722 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.543575 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svkrj\" (UniqueName: \"kubernetes.io/projected/72e02ddf-2fca-414a-b5b5-2900567c181e-kube-api-access-svkrj\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.543722 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.543610 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72e02ddf-2fca-414a-b5b5-2900567c181e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.543956 
ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.543930 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.544023 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.543971 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.544023 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.543989 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.545719 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.545696 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72e02ddf-2fca-414a-b5b5-2900567c181e-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.545993 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.545976 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72e02ddf-2fca-414a-b5b5-2900567c181e-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: 
\"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.550971 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.550950 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svkrj\" (UniqueName: \"kubernetes.io/projected/72e02ddf-2fca-414a-b5b5-2900567c181e-kube-api-access-svkrj\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z\" (UID: \"72e02ddf-2fca-414a-b5b5-2900567c181e\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.670872 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.670799 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:49.792339 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:49.792287 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z"] Apr 21 02:02:49.794411 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:02:49.794377 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72e02ddf_2fca_414a_b5b5_2900567c181e.slice/crio-5c3db524a13891864e0dadbf1493150c923b3fc21c6b9ae02133daa1fc2f7050 WatchSource:0}: Error finding container 5c3db524a13891864e0dadbf1493150c923b3fc21c6b9ae02133daa1fc2f7050: Status 404 returned error can't find the container with id 5c3db524a13891864e0dadbf1493150c923b3fc21c6b9ae02133daa1fc2f7050 Apr 21 02:02:50.452239 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:50.452195 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" event={"ID":"72e02ddf-2fca-414a-b5b5-2900567c181e","Type":"ContainerStarted","Data":"cc43ebfe3216117624f59fbfb54c581755f05423b01180247ccccf05796874f5"} Apr 21 02:02:50.452239 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:50.452242 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" event={"ID":"72e02ddf-2fca-414a-b5b5-2900567c181e","Type":"ContainerStarted","Data":"5c3db524a13891864e0dadbf1493150c923b3fc21c6b9ae02133daa1fc2f7050"} Apr 21 02:02:51.432869 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:51.432832 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-cqzv2" Apr 21 02:02:55.471114 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:55.471027 2568 generic.go:358] "Generic (PLEG): container finished" podID="72e02ddf-2fca-414a-b5b5-2900567c181e" containerID="cc43ebfe3216117624f59fbfb54c581755f05423b01180247ccccf05796874f5" exitCode=0 Apr 21 02:02:55.471114 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:55.471096 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" event={"ID":"72e02ddf-2fca-414a-b5b5-2900567c181e","Type":"ContainerDied","Data":"cc43ebfe3216117624f59fbfb54c581755f05423b01180247ccccf05796874f5"} Apr 21 02:02:56.448967 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:56.448929 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-rh27p" Apr 21 02:02:56.476531 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:56.476501 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" event={"ID":"72e02ddf-2fca-414a-b5b5-2900567c181e","Type":"ContainerStarted","Data":"b82773ede691fe218b30f39f7c0ea5032119aa7da1927f39fe25d2135304db00"} Apr 21 02:02:56.476968 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:56.476799 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:02:56.496035 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:02:56.495984 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" podStartSLOduration=7.145799788 podStartE2EDuration="7.495973247s" podCreationTimestamp="2026-04-21 02:02:49 +0000 UTC" firstStartedPulling="2026-04-21 02:02:55.471743452 +0000 UTC m=+755.241705604" lastFinishedPulling="2026-04-21 02:02:55.821916898 +0000 UTC m=+755.591879063" observedRunningTime="2026-04-21 02:02:56.494556045 +0000 UTC m=+756.264518231" watchObservedRunningTime="2026-04-21 02:02:56.495973247 +0000 UTC m=+756.265935421" Apr 21 02:03:07.493327 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:03:07.493281 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z" Apr 21 02:04:28.315255 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:28.315227 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-c78dcbcb5-gdqr6_03e4ec52-d66b-4e41-b9c4-8e26ed7b288b/manager/0.log" Apr 21 02:04:28.534624 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:28.534594 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-64bbc69db5-55468_228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95/manager/0.log" Apr 21 02:04:30.264327 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:30.264283 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-lq9cx_2071548b-e9d2-4389-851f-3b7520e5793f/manager/0.log" Apr 21 02:04:30.376540 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:30.376506 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-t8lwh_fd19c6b1-9354-4bea-bb71-96ba67635021/manager/0.log" Apr 21 02:04:30.590645 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:30.590617 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-wwjvh_1d54a962-a45f-4db5-b8db-7a54953f9142/registry-server/0.log" Apr 21 
02:04:30.933928 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:30.933848 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-d4cbl_91a20877-a6ca-4350-a13e-d6cbbba878d2/manager/0.log" Apr 21 02:04:31.264198 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:31.264123 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb_d0d0a9bb-7444-42ba-952e-886dda685898/istio-proxy/0.log" Apr 21 02:04:32.153958 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.153933 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2_657fe588-2dc9-401b-812e-6fee3fa8133a/storage-initializer/0.log" Apr 21 02:04:32.160227 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.160206 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-dr9t2_657fe588-2dc9-401b-812e-6fee3fa8133a/main/0.log" Apr 21 02:04:32.270636 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.270606 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z_72e02ddf-2fca-414a-b5b5-2900567c181e/storage-initializer/0.log" Apr 21 02:04:32.277626 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.277603 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-tcg4z_72e02ddf-2fca-414a-b5b5-2900567c181e/main/0.log" Apr 21 02:04:32.387141 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.387112 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-cqzv2_6fe11027-c13b-4c8b-b770-2c4938867237/storage-initializer/0.log" Apr 21 02:04:32.393655 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.393637 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-cqzv2_6fe11027-c13b-4c8b-b770-2c4938867237/main/0.log" Apr 21 02:04:32.500834 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.500762 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh_48e1075e-4fa4-45ff-99c6-6f8bb616c7b9/main/0.log" Apr 21 02:04:32.506579 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.506561 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccx4bvh_48e1075e-4fa4-45ff-99c6-6f8bb616c7b9/storage-initializer/0.log" Apr 21 02:04:32.614866 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.614838 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq_c42c4647-9994-4c36-8b7e-592e82fab77c/storage-initializer/0.log" Apr 21 02:04:32.622195 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.622173 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-c7bsq_c42c4647-9994-4c36-8b7e-592e82fab77c/main/0.log" Apr 21 02:04:32.733355 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.733327 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-rh27p_4e05eab3-c898-415d-b93a-e76a3651ad05/storage-initializer/0.log" Apr 21 02:04:32.739930 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:32.739911 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-rh27p_4e05eab3-c898-415d-b93a-e76a3651ad05/main/0.log" Apr 21 02:04:39.305696 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:39.305667 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-47fz2_3d6924f9-e30d-4086-95a0-8d525b528e62/global-pull-secret-syncer/0.log" Apr 21 02:04:39.479023 
ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:39.478990 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rtnjq_899e33b4-0828-4733-86e4-56750cb8ec32/konnectivity-agent/0.log" Apr 21 02:04:39.541965 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:39.541936 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-35.ec2.internal_26365148514fa10173edd0155a08a3fb/haproxy/0.log" Apr 21 02:04:43.546859 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:43.546783 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-lq9cx_2071548b-e9d2-4389-851f-3b7520e5793f/manager/0.log" Apr 21 02:04:43.570192 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:43.570162 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-t8lwh_fd19c6b1-9354-4bea-bb71-96ba67635021/manager/0.log" Apr 21 02:04:43.627369 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:43.627347 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-wwjvh_1d54a962-a45f-4db5-b8db-7a54953f9142/registry-server/0.log" Apr 21 02:04:43.719555 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:43.719521 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-d4cbl_91a20877-a6ca-4350-a13e-d6cbbba878d2/manager/0.log" Apr 21 02:04:45.091321 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.091281 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-tbrdm_00c23ae5-f0a8-414b-9e12-1dfa9725e21a/cluster-monitoring-operator/0.log" Apr 21 02:04:45.215889 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.215812 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-gwb4q_4dd276dd-7cf5-4018-bea5-741cfedf9db9/monitoring-plugin/0.log" Apr 21 02:04:45.248600 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.248577 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ktwrd_a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1/node-exporter/0.log" Apr 21 02:04:45.272640 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.272616 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ktwrd_a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1/kube-rbac-proxy/0.log" Apr 21 02:04:45.297816 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.297794 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ktwrd_a31f1fd7-9a55-4c9b-8bdf-c09c4290b0c1/init-textfile/0.log" Apr 21 02:04:45.572331 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.572240 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_19cd4ebd-4136-4d50-af9b-5f703d01d7c8/prometheus/0.log" Apr 21 02:04:45.593496 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.593469 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_19cd4ebd-4136-4d50-af9b-5f703d01d7c8/config-reloader/0.log" Apr 21 02:04:45.614521 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.614503 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_19cd4ebd-4136-4d50-af9b-5f703d01d7c8/thanos-sidecar/0.log" Apr 21 02:04:45.636206 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.636186 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_19cd4ebd-4136-4d50-af9b-5f703d01d7c8/kube-rbac-proxy-web/0.log" Apr 21 02:04:45.658192 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.658170 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_19cd4ebd-4136-4d50-af9b-5f703d01d7c8/kube-rbac-proxy/0.log" Apr 21 02:04:45.682555 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.682528 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_19cd4ebd-4136-4d50-af9b-5f703d01d7c8/kube-rbac-proxy-thanos/0.log" Apr 21 02:04:45.706372 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:45.706354 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_19cd4ebd-4136-4d50-af9b-5f703d01d7c8/init-config-reloader/0.log" Apr 21 02:04:47.213665 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:47.213637 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-cjcd4_b5c53bee-bb5b-4e22-8b9a-eb988c725638/networking-console-plugin/0.log" Apr 21 02:04:47.705504 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:47.705475 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/1.log" Apr 21 02:04:47.710619 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:47.710598 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx5mm_e91998d5-ea6d-4d46-8984-013ce4758689/console-operator/2.log" Apr 21 02:04:48.045252 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.045170 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn"] Apr 21 02:04:48.049040 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.049018 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.051016 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.050993 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5xmbv\"/\"default-dockercfg-whz96\"" Apr 21 02:04:48.051990 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.051971 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5xmbv\"/\"openshift-service-ca.crt\"" Apr 21 02:04:48.052104 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.051971 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5xmbv\"/\"kube-root-ca.crt\"" Apr 21 02:04:48.056966 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.056938 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn"] Apr 21 02:04:48.237782 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.237743 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-lib-modules\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.238114 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.237799 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftqrl\" (UniqueName: \"kubernetes.io/projected/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-kube-api-access-ftqrl\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.238114 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.237829 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-podres\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.238114 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.237868 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-proc\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.238114 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.237912 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-sys\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.238114 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.237946 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-hjxmg_98d2bf0f-70f2-458d-8888-7120eb19ed23/download-server/0.log" Apr 21 02:04:48.338713 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.338684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftqrl\" (UniqueName: \"kubernetes.io/projected/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-kube-api-access-ftqrl\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.338884 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.338729 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-podres\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.338884 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.338779 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-proc\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.338884 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.338834 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-sys\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.339040 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.338898 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-lib-modules\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.339040 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.338897 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-proc\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.339040 ip-10-0-141-35 
kubenswrapper[2568]: I0421 02:04:48.338923 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-podres\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.339040 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.338952 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-sys\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.339040 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.339007 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-lib-modules\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.345694 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.345677 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftqrl\" (UniqueName: \"kubernetes.io/projected/e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc-kube-api-access-ftqrl\") pod \"perf-node-gather-daemonset-54rjn\" (UID: \"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.359344 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.359325 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.484208 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.484182 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn"] Apr 21 02:04:48.485696 ip-10-0-141-35 kubenswrapper[2568]: W0421 02:04:48.485666 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode7fa5c5a_3ccc_4e5d_82ef_6f815e6eb4bc.slice/crio-cafad88eb4d5781ee26644a7c1a6dd7344ca9510c6512b878f77159d7fbf3def WatchSource:0}: Error finding container cafad88eb4d5781ee26644a7c1a6dd7344ca9510c6512b878f77159d7fbf3def: Status 404 returned error can't find the container with id cafad88eb4d5781ee26644a7c1a6dd7344ca9510c6512b878f77159d7fbf3def Apr 21 02:04:48.731632 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.731551 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-f65qv_b78c9a0f-a030-444c-886f-a49679306c25/volume-data-source-validator/0.log" Apr 21 02:04:48.881558 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.881525 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" event={"ID":"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc","Type":"ContainerStarted","Data":"fbdcabe5242f64b2bd8eb66c9685b8874629bfe9130d8e073f582cabeba2ba7b"} Apr 21 02:04:48.881558 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.881560 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" event={"ID":"e7fa5c5a-3ccc-4e5d-82ef-6f815e6eb4bc","Type":"ContainerStarted","Data":"cafad88eb4d5781ee26644a7c1a6dd7344ca9510c6512b878f77159d7fbf3def"} Apr 21 02:04:48.881747 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.881631 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:48.894613 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:48.894574 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" podStartSLOduration=0.894562718 podStartE2EDuration="894.562718ms" podCreationTimestamp="2026-04-21 02:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:04:48.894360586 +0000 UTC m=+868.664322763" watchObservedRunningTime="2026-04-21 02:04:48.894562718 +0000 UTC m=+868.664524893" Apr 21 02:04:49.532459 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:49.532431 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fjfcj_82bc4cd8-ea54-4a2a-ae0f-172581c8dace/dns/0.log" Apr 21 02:04:49.552829 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:49.552806 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fjfcj_82bc4cd8-ea54-4a2a-ae0f-172581c8dace/kube-rbac-proxy/0.log" Apr 21 02:04:49.688250 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:49.688219 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-shp9g_d406d2d2-81f8-42f9-bc0e-baf9d5cdccc8/dns-node-resolver/0.log" Apr 21 02:04:50.247323 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:50.247284 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8654d455dc-jnl96_5f58d8d6-b76a-47fa-b63b-84a55e51e3e8/registry/0.log" Apr 21 02:04:50.265104 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:50.265081 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bv5lq_94b3f448-6380-4226-b329-a7e8b2cad657/node-ca/0.log" Apr 21 02:04:51.057267 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:51.057230 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfwmtbb_d0d0a9bb-7444-42ba-952e-886dda685898/istio-proxy/0.log" Apr 21 02:04:51.672356 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:51.672300 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9zg7k_7d936ad0-666c-49cb-8f95-d04607cc5b52/serve-healthcheck-canary/0.log" Apr 21 02:04:52.122552 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:52.122520 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-cqf6k_35c0ff07-3052-4608-8a7c-4b86babf4ea2/insights-operator/1.log" Apr 21 02:04:52.123075 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:52.122732 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-cqf6k_35c0ff07-3052-4608-8a7c-4b86babf4ea2/insights-operator/0.log" Apr 21 02:04:52.256043 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:52.255996 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pbql6_7c4e2249-a98a-4f0c-b7c6-84207d6db519/kube-rbac-proxy/0.log" Apr 21 02:04:52.274150 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:52.274119 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pbql6_7c4e2249-a98a-4f0c-b7c6-84207d6db519/exporter/0.log" Apr 21 02:04:52.293263 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:52.293226 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pbql6_7c4e2249-a98a-4f0c-b7c6-84207d6db519/extractor/0.log" Apr 21 02:04:54.218912 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:54.218869 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-c78dcbcb5-gdqr6_03e4ec52-d66b-4e41-b9c4-8e26ed7b288b/manager/0.log" Apr 21 02:04:54.266477 ip-10-0-141-35 kubenswrapper[2568]: 
I0421 02:04:54.266446 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-64bbc69db5-55468_228b1e7c-a37b-4b2d-bdb3-cd860c6e5a95/manager/0.log" Apr 21 02:04:54.894985 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:54.894955 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-54rjn" Apr 21 02:04:55.536067 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:55.536039 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5bf8b8945f-c52mb_931c8473-29e3-410c-8ac1-f093c5626f55/manager/0.log" Apr 21 02:04:55.581543 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:55.581510 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-8g4rz_07860c32-39e8-4073-8b12-b849eec6c664/openshift-lws-operator/0.log" Apr 21 02:04:59.816165 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:59.816088 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sp8qx_c50383dd-d7ca-429b-a12e-4dea1a2761b8/migrator/0.log" Apr 21 02:04:59.834318 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:04:59.834278 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sp8qx_c50383dd-d7ca-429b-a12e-4dea1a2761b8/graceful-termination/0.log" Apr 21 02:05:00.155000 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:00.154968 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-75sps_fca9b761-dba2-40df-99e6-41e04c0a7ffb/kube-storage-version-migrator-operator/1.log" Apr 21 02:05:00.155893 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:00.155870 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-75sps_fca9b761-dba2-40df-99e6-41e04c0a7ffb/kube-storage-version-migrator-operator/0.log" Apr 21 02:05:01.471284 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:01.471256 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzkbb_446458a6-4d58-4666-88b6-92203ea344ee/kube-multus-additional-cni-plugins/0.log" Apr 21 02:05:01.489915 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:01.489892 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzkbb_446458a6-4d58-4666-88b6-92203ea344ee/egress-router-binary-copy/0.log" Apr 21 02:05:01.507404 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:01.507382 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzkbb_446458a6-4d58-4666-88b6-92203ea344ee/cni-plugins/0.log" Apr 21 02:05:01.526130 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:01.526106 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzkbb_446458a6-4d58-4666-88b6-92203ea344ee/bond-cni-plugin/0.log" Apr 21 02:05:01.545716 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:01.545698 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzkbb_446458a6-4d58-4666-88b6-92203ea344ee/routeoverride-cni/0.log" Apr 21 02:05:01.564054 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:01.564034 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzkbb_446458a6-4d58-4666-88b6-92203ea344ee/whereabouts-cni-bincopy/0.log" Apr 21 02:05:01.582231 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:01.582205 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wzkbb_446458a6-4d58-4666-88b6-92203ea344ee/whereabouts-cni/0.log" Apr 21 02:05:01.663048 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:01.663021 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rmwq4_4d5eb879-dd29-4c7f-8643-6f9a6b561eda/kube-multus/0.log" Apr 21 02:05:01.761570 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:01.761507 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-pqvmq_333616b1-f960-4eb6-b4fd-448534b9cd3a/network-metrics-daemon/0.log" Apr 21 02:05:01.777954 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:01.777932 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-pqvmq_333616b1-f960-4eb6-b4fd-448534b9cd3a/kube-rbac-proxy/0.log" Apr 21 02:05:02.556715 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:02.556679 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-controller/0.log" Apr 21 02:05:02.572001 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:02.571977 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/0.log" Apr 21 02:05:02.576497 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:02.576478 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovn-acl-logging/1.log" Apr 21 02:05:02.592978 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:02.592952 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/kube-rbac-proxy-node/0.log" Apr 21 02:05:02.611117 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:02.611090 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 02:05:02.627445 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:02.627422 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/northd/0.log" Apr 21 02:05:02.647913 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:02.647887 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/nbdb/0.log" Apr 21 02:05:02.665581 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:02.665560 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/sbdb/0.log" Apr 21 02:05:02.754217 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:02.754190 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvzx2_8889bb55-ecc3-4f0f-b6a3-5c5f2e739440/ovnkube-controller/0.log" Apr 21 02:05:04.281907 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:04.281872 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-l6sfq_0021f619-a97e-4a5d-abda-e2dd3c7a3b80/check-endpoints/0.log" Apr 21 02:05:04.348905 ip-10-0-141-35 kubenswrapper[2568]: I0421 02:05:04.348872 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-z86fj_ee736496-b4a2-4832-ab28-516d69f51886/network-check-target-container/0.log"